Dec 06 09:06:21 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 06 09:06:21 crc restorecon[4657]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 09:06:21 crc restorecon[4657]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:21 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 
09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc 
restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 09:06:22 crc restorecon[4657]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 06 09:06:22 crc kubenswrapper[4672]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 09:06:22 crc kubenswrapper[4672]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 06 09:06:22 crc kubenswrapper[4672]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 09:06:22 crc kubenswrapper[4672]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
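
The restorecon pass above is reporting, for each path under /var/lib/kubelet, that the admin-customized SELinux context (container_file_t, usually carrying per-pod MCS category pairs such as s0:c7,c13) was left in place rather than reset to the policy default. To spot-check the context a given path currently carries, the label can be read straight from the security.selinux extended attribute; a minimal Python sketch, Linux-only, with the path taken from the log above:

    import os

    def selinux_label(path: str) -> str:
        # The kernel exposes a file's SELinux context via the
        # "security.selinux" extended attribute; the value is
        # NUL-terminated bytes.
        raw = os.getxattr(path, "security.selinux")
        return raw.decode().rstrip("\x00")

    # Per the restorecon entries above this should print something like
    # system_u:object_r:container_file_t:s0
    print(selinux_label("/var/lib/kubelet/plugins"))
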
Dec 06 09:06:22 crc kubenswrapper[4672]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 06 09:06:22 crc kubenswrapper[4672]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.382411 4672 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386418 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386438 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386444 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386449 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386454 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386459 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386463 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386467 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386472 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386476 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386480 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386485 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386489 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386501 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386505 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386509 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386512 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386516 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386519 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386524 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
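
Each of the flag-deprecation warnings above points at the same remedy: move the setting into the KubeletConfiguration file passed via --config. A sketch of the equivalent stanza, assuming the upstream kubelet.config.k8s.io/v1beta1 schema; the endpoint, taint, and reservation values are illustrative, not taken from this host:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint (a CRI-O socket, as an example)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --register-with-taints
    registerWithTaints:
      - key: node-role.kubernetes.io/master
        effect: NoSchedule
    # replaces --system-reserved
    systemReserved:
      cpu: 500m
      memory: 1Gi
    # --minimum-container-ttl-duration has no direct equivalent; its warning
    # says to use eviction thresholds instead
    evictionHard:
      memory.available: 100Mi

The one flag with no config-file counterpart is --pod-infra-container-image: per the server.go:211 line above, the sandbox image is obtained from the CRI runtime and should also be set there.
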
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386529 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386533 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386536 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386540 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386543 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386548 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386551 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386555 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386558 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386561 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386565 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386568 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386573 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386578 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386581 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386585 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386589 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386593 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386613 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386617 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386620 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386624 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386627 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386632 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386635 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386639 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386642 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386645 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386649 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386652 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386655 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386659 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386662 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386666 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386669 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386672 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386675 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386680 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386684 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386688 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386692 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386696 4672 feature_gate.go:330] unrecognized feature gate: Example Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386699 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386703 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386706 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386710 4672 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386713 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386716 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386720 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386724 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.386727 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386932 4672 flags.go:64] FLAG: --address="0.0.0.0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386946 4672 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386972 4672 flags.go:64] FLAG: --anonymous-auth="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386978 4672 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386984 4672 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386988 4672 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.386994 4672 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387003 4672 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387007 4672 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387011 4672 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387016 4672 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387020 4672 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387024 4672 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387028 4672 flags.go:64] FLAG: --cgroup-root="" Dec 06 09:06:22 crc kubenswrapper[4672]: 
I1206 09:06:22.387033 4672 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387038 4672 flags.go:64] FLAG: --client-ca-file="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387043 4672 flags.go:64] FLAG: --cloud-config="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387048 4672 flags.go:64] FLAG: --cloud-provider="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387053 4672 flags.go:64] FLAG: --cluster-dns="[]" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387060 4672 flags.go:64] FLAG: --cluster-domain="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387066 4672 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387071 4672 flags.go:64] FLAG: --config-dir="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387076 4672 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387082 4672 flags.go:64] FLAG: --container-log-max-files="5" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387089 4672 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387094 4672 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387099 4672 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387104 4672 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387109 4672 flags.go:64] FLAG: --contention-profiling="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387114 4672 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387118 4672 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387124 4672 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387128 4672 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387136 4672 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387142 4672 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387147 4672 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387153 4672 flags.go:64] FLAG: --enable-load-reader="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387158 4672 flags.go:64] FLAG: --enable-server="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387163 4672 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387168 4672 flags.go:64] FLAG: --event-burst="100" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387172 4672 flags.go:64] FLAG: --event-qps="50" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387177 4672 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387181 4672 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387185 4672 flags.go:64] FLAG: --eviction-hard="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387190 4672 
flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387194 4672 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387198 4672 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387203 4672 flags.go:64] FLAG: --eviction-soft="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387207 4672 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387211 4672 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387215 4672 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387219 4672 flags.go:64] FLAG: --experimental-mounter-path="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387223 4672 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387227 4672 flags.go:64] FLAG: --fail-swap-on="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387231 4672 flags.go:64] FLAG: --feature-gates="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387236 4672 flags.go:64] FLAG: --file-check-frequency="20s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387240 4672 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387245 4672 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387249 4672 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387254 4672 flags.go:64] FLAG: --healthz-port="10248" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387258 4672 flags.go:64] FLAG: --help="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387262 4672 flags.go:64] FLAG: --hostname-override="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387266 4672 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387270 4672 flags.go:64] FLAG: --http-check-frequency="20s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387274 4672 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387279 4672 flags.go:64] FLAG: --image-credential-provider-config="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387283 4672 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387287 4672 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387291 4672 flags.go:64] FLAG: --image-service-endpoint="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387295 4672 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387300 4672 flags.go:64] FLAG: --kube-api-burst="100" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387304 4672 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387309 4672 flags.go:64] FLAG: --kube-api-qps="50" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387312 4672 flags.go:64] FLAG: --kube-reserved="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387317 4672 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387321 4672 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387325 4672 flags.go:64] FLAG: --kubelet-cgroups="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387329 4672 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387332 4672 flags.go:64] FLAG: --lock-file="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387336 4672 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387342 4672 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387346 4672 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387352 4672 flags.go:64] FLAG: --log-json-split-stream="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387356 4672 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387360 4672 flags.go:64] FLAG: --log-text-split-stream="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387364 4672 flags.go:64] FLAG: --logging-format="text" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387368 4672 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387372 4672 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387376 4672 flags.go:64] FLAG: --manifest-url="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387380 4672 flags.go:64] FLAG: --manifest-url-header="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387386 4672 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387390 4672 flags.go:64] FLAG: --max-open-files="1000000" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387395 4672 flags.go:64] FLAG: --max-pods="110" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387399 4672 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387403 4672 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387407 4672 flags.go:64] FLAG: --memory-manager-policy="None" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387411 4672 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387416 4672 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387420 4672 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387424 4672 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387434 4672 flags.go:64] FLAG: --node-status-max-images="50" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387438 4672 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387442 4672 flags.go:64] FLAG: --oom-score-adj="-999" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387455 4672 flags.go:64] FLAG: --pod-cidr="" Dec 06 09:06:22 crc 
kubenswrapper[4672]: I1206 09:06:22.387459 4672 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387466 4672 flags.go:64] FLAG: --pod-manifest-path="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387470 4672 flags.go:64] FLAG: --pod-max-pids="-1" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387475 4672 flags.go:64] FLAG: --pods-per-core="0" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387479 4672 flags.go:64] FLAG: --port="10250" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387483 4672 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387487 4672 flags.go:64] FLAG: --provider-id="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387491 4672 flags.go:64] FLAG: --qos-reserved="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387495 4672 flags.go:64] FLAG: --read-only-port="10255" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387499 4672 flags.go:64] FLAG: --register-node="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387503 4672 flags.go:64] FLAG: --register-schedulable="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387507 4672 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387514 4672 flags.go:64] FLAG: --registry-burst="10" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387518 4672 flags.go:64] FLAG: --registry-qps="5" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387522 4672 flags.go:64] FLAG: --reserved-cpus="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387526 4672 flags.go:64] FLAG: --reserved-memory="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387531 4672 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387535 4672 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387540 4672 flags.go:64] FLAG: --rotate-certificates="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387544 4672 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387548 4672 flags.go:64] FLAG: --runonce="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387552 4672 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387556 4672 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387561 4672 flags.go:64] FLAG: --seccomp-default="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387566 4672 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387571 4672 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387577 4672 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387583 4672 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387588 4672 flags.go:64] FLAG: --storage-driver-password="root" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387592 4672 flags.go:64] FLAG: 
--storage-driver-secure="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387614 4672 flags.go:64] FLAG: --storage-driver-table="stats" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387619 4672 flags.go:64] FLAG: --storage-driver-user="root" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387624 4672 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387628 4672 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387633 4672 flags.go:64] FLAG: --system-cgroups="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387637 4672 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387643 4672 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387647 4672 flags.go:64] FLAG: --tls-cert-file="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387653 4672 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387658 4672 flags.go:64] FLAG: --tls-min-version="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387662 4672 flags.go:64] FLAG: --tls-private-key-file="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387666 4672 flags.go:64] FLAG: --topology-manager-policy="none" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387670 4672 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387674 4672 flags.go:64] FLAG: --topology-manager-scope="container" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387678 4672 flags.go:64] FLAG: --v="2" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387684 4672 flags.go:64] FLAG: --version="false" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387690 4672 flags.go:64] FLAG: --vmodule="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387695 4672 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.387699 4672 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387802 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387807 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387811 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387815 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387818 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387822 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387825 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387830 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387835 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387840 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387844 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387848 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387853 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387857 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387861 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387865 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387868 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387872 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387875 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387879 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387882 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387886 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387889 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387892 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387896 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387900 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387904 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387907 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387911 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387914 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387918 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387921 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387926 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387930 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387933 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387937 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387941 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387944 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387947 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387951 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387954 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387958 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387962 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387966 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387969 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387973 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387977 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387980 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387983 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387987 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387990 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387994 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.387997 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388002 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388006 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388010 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388014 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388018 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388022 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388026 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388030 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388034 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388038 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388042 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388045 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388049 4672 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388052 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388056 4672 feature_gate.go:330] unrecognized feature gate: Example Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388059 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388063 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.388066 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.388201 4672 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.400082 4672 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.400132 4672 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400307 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400328 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400339 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400350 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400359 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400369 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400378 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400386 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400395 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400403 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400412 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400420 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400429 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400438 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400446 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400455 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400464 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400473 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400481 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400489 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400498 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400506 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400515 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400526 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400536 4672 feature_gate.go:330] unrecognized feature gate: Example Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400546 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400555 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400566 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400578 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400591 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400665 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400686 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400700 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400711 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400722 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400737 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400773 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400785 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400798 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400809 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400820 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400832 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400843 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400854 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400864 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400875 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400886 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400897 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400907 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400919 4672 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400930 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400959 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400969 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400977 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400985 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.400994 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401002 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401011 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401019 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401027 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401045 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401055 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401065 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401073 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401082 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401090 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401098 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401107 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401116 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401137 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401145 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.401169 4672 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401493 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401537 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401550 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401563 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401575 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401586 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401633 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401646 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401657 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401668 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401722 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401731 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401740 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401749 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401757 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401766 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401775 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401785 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401793 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401806 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401818 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401828 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401837 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401848 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401856 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401868 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401878 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401887 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401895 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401905 4672 feature_gate.go:330] unrecognized feature gate: Example Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401913 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401923 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401931 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401940 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401949 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401957 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401965 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401976 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401985 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.401993 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402002 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402010 4672 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402018 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402030 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402040 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402049 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402058 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402067 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402075 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402083 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402091 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402101 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402110 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402119 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402127 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402136 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402145 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402154 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402162 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402170 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402178 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402187 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402195 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402204 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402212 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402221 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402229 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402240 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402250 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402259 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.402267 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.402281 4672 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.402912 4672 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.408162 4672 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.408364 4672 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.409304 4672 server.go:997] "Starting client certificate rotation"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.409352 4672 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.409582 4672 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-03 09:17:48.699283589 +0000 UTC
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.409706 4672 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 672h11m26.289581994s for next certificate rotation
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.415201 4672 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.416946 4672 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.425792 4672 log.go:25] "Validated CRI v1 runtime API"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.443683 4672 log.go:25] "Validated CRI v1 image API"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.446148 4672 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.449386 4672 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-06-09-00-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.449453 4672 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.472147 4672 manager.go:217] Machine: {Timestamp:2025-12-06 09:06:22.4704687 +0000 UTC m=+0.214729067 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7e6e2ea0-eb53-4cec-8366-444329cefc63 BootID:dee4872a-ee41-4a28-b591-3da52b9dd3d6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:63:44:bd Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:63:44:bd Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:2f:d1:73 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e8:07:5a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:28:eb:da Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:73:8e:16 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:96:27:b8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:92:42:19:7a:bc:4e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8a:b0:32:58:c8:7e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.472507 4672 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.472843 4672 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.473266 4672 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.473494 4672 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.473532 4672 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.473826 4672 topology_manager.go:138] "Creating topology manager with none policy"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.473838 4672 container_manager_linux.go:303] "Creating device plugin manager"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.474118 4672 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.474169 4672 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.474504 4672 state_mem.go:36] "Initialized new in-memory state store"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.474596 4672 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.475281 4672 kubelet.go:418] "Attempting to sync node with API server"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.475313 4672 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.475360 4672 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.475398 4672 kubelet.go:324] "Adding apiserver pod source"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.475417 4672 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.479766 4672 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.481479 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.481698 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.481823 4672 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.481769 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.481929 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.482694 4672 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483209 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483233 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483242 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483252 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483265 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483275 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483284 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483298 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483311 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483338 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483353 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483363 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.483531 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.484089 4672 server.go:1280] "Started kubelet"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.484446 4672 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.484776 4672 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.485390 4672 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 06 09:06:22 crc systemd[1]: Started Kubernetes Kubelet.
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.486421 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.487005 4672 server.go:460] "Adding debug handlers to kubelet server"
Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.486894 4672 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e950bf0452ac6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 09:06:22.484056774 +0000 UTC m=+0.228317061,LastTimestamp:2025-12-06 09:06:22.484056774 +0000 UTC m=+0.228317061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.487729 4672 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.487768 4672 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.488139 4672 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:04:02.821348954 +0000 UTC
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.488186 4672 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 101h57m40.333167448s for next certificate rotation
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.488394 4672 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.488422 4672 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.488545 4672 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.489933 4672 factory.go:55] Registering systemd factory
Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.490674 4672 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.490862 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms"
Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.489889 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.491274 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.491645 4672 factory.go:221] Registration of the systemd container factory successfully
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.492811 4672 factory.go:153] Registering CRI-O factory
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.492849 4672 factory.go:221] Registration of the crio container factory successfully
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.492933 4672 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.492957 4672 factory.go:103] Registering Raw factory
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.492978 4672 manager.go:1196] Started watching for new ooms in manager
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.493684 4672 manager.go:319] Starting recovery of all containers
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512180 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512322 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512344 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512453 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512471 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512502 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512518 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512534 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512557 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512580 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512639 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512712 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512749 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512771 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512793 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512810 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512928 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.512982 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.522861 4672 manager.go:324] Recovery completed
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524207 4672 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524262 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524286 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524305 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524322 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524339 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524358 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524379 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524396 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524420 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524438 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524457 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524476 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524492 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524512 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524531 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524549 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524567 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524586 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524630 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524649 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524665 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524682 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524701 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524718 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524734 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524761 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524777 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524792 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524807 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524841 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524859 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524893 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524911 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524928 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524952 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524970 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.524987 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525038 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525060 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525076 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525108 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525122 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525139 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525153 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525168 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525182 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525197 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525212 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525229 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525248 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525263 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525299 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525314 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525328 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525344 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525359 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525378 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525426 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525444 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525459 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525474 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525491 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525505 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525566 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525588 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525651 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525671 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525688 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525703 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525719 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525737 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525751 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525767 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525783 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525798 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525814 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525828 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525844 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525859 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525875 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525889 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525905 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525919 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525934 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525950 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525965 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.525988 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526004 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526020 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526035 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526051 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526066 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526082 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526099 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526117 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526138 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526155 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526172 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526188 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526202 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526217 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526231 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526246 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526262 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526276 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526291 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526306 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526320 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526336 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526350 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526366 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526383 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526399 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526413 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526427 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526442 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526456 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526469 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526484 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526499 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526514 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526529 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526543 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526559 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526573 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526587 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526626 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526645 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526661 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526675 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526690 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9"
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526706 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526722 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526736 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526753 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526768 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526782 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526797 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526812 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526827 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526842 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526857 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526870 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526885 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526901 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526916 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526931 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526945 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526959 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526974 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.526991 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527006 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527021 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527036 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527053 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527068 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527083 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527100 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527117 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527133 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527147 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527162 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527178 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527193 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527208 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527222 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527239 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527254 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527267 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527281 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527295 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527312 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527326 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527342 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527358 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527376 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527392 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527410 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527425 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527440 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527457 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527474 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527493 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527510 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527526 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527543 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527560 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527575 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527592 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527629 4672 reconstruct.go:97] "Volume reconstruction finished" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.527641 4672 reconciler.go:26] "Reconciler: start to sync state" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.537295 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.539144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.539174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.539182 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.540024 4672 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.540051 4672 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.540078 4672 state_mem.go:36] "Initialized new in-memory state store" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.548358 4672 policy_none.go:49] "None policy: Start" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.551853 4672 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.551907 4672 state_mem.go:35] "Initializing new in-memory state store" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.553467 4672 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.555493 4672 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.555538 4672 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.555570 4672 kubelet.go:2335] "Starting kubelet main sync loop" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.555646 4672 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 06 09:06:22 crc kubenswrapper[4672]: W1206 09:06:22.556832 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.556897 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.591044 4672 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.591758 4672 manager.go:334] "Starting Device Plugin manager" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.591823 4672 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.591837 4672 server.go:79] "Starting device plugin registration server" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.592268 4672 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.592284 4672 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.592576 4672 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.592745 4672 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.592764 4672 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.600845 4672 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.656656 4672 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.656775 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.658248 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.658303 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.658316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.658522 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.658732 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.658789 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.659840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.659888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.659895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.659905 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.659965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.659978 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.660350 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.660383 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.660231 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.661354 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.661432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.661453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.661865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.661902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.661920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.662093 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.662978 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663051 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663582 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663692 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.663721 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.669871 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.669907 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.669919 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.669977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670022 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670189 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670248 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670261 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670331 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.670383 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.671742 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.671780 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.671791 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.692422 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.692544 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.694251 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.694346 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.694370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.694426 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.695011 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730686 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730728 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730754 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730773 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730790 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730805 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730834 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730850 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730866 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.730953 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.731007 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.731055 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.731101 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.731126 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.832940 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.832988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833015 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833033 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833050 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833065 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833079 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833095 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833109 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833126 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833142 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833173 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833191 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833208 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833204 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833222 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833275 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: 
I1206 09:06:22.833282 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833306 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833224 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833304 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833319 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833328 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833335 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833355 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833366 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
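Note: every volume in this mount storm is a hostPath volume belonging to a static pod (UIDs such as 2139d3e2895fc6797b9c76a1b4c9886d are derived from the manifests rather than assigned by the API server), which is why each one runs the whole VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded trail within about a millisecond: hostPath needs no attach and no real mount. A sketch of why SetUp is near-instant for this volume type; it is a simplified illustration of the idea, not the real plugin (pkg/volume/hostpath), and the example paths are assumptions, not taken from the manifests:

```go
// Sketch: hostPath SetUp reduces to validating the backing host path.
package main

import (
	"fmt"
	"os"
)

// setUpHostPath checks that the host directory backing the volume exists;
// there is nothing to attach or mount for this volume type. (The real
// plugin also handles file-type hostPaths; this sketch only handles dirs.)
func setUpHostPath(path string) error {
	info, err := os.Stat(path)
	if err != nil {
		return fmt.Errorf("hostPath %q: %w", path, err)
	}
	if !info.IsDir() {
		return fmt.Errorf("hostPath %q is not a directory", path)
	}
	return nil
}

func main() {
	// Hypothetical host paths standing in for volumes like "data-dir"
	// and "resource-dir" above.
	for _, p := range []string{"/var/lib/etcd", "/etc/kubernetes/static-pod-resources"} {
		if err := setUpHostPath(p); err != nil {
			fmt.Println("MountVolume.SetUp failed:", err)
			continue
		}
		fmt.Printf("MountVolume.SetUp succeeded for %q\n", p)
	}
}
```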
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833381 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833386 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.833403 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.895553 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.896977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.897021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.897031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.897055 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.897692 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Dec 06 09:06:22 crc kubenswrapper[4672]: E1206 09:06:22.902187 4672 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e950bf0452ac6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 09:06:22.484056774 +0000 UTC m=+0.228317061,LastTimestamp:2025-12-06 09:06:22.484056774 +0000 UTC m=+0.228317061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 09:06:22 crc kubenswrapper[4672]: I1206 09:06:22.990359 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.014379 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.019060 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-03f4762aad1aafa36a10d8c8458308575f6acca3235f897787b6e9b2fdef2829 WatchSource:0}: Error finding container 03f4762aad1aafa36a10d8c8458308575f6acca3235f897787b6e9b2fdef2829: Status 404 returned error can't find the container with id 03f4762aad1aafa36a10d8c8458308575f6acca3235f897787b6e9b2fdef2829 Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.021677 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.031961 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-29aab898f20508f80d4d2a054022c6681d961fa8c105e22ed9dda26d488470d4 WatchSource:0}: Error finding container 29aab898f20508f80d4d2a054022c6681d961fa8c105e22ed9dda26d488470d4: Status 404 returned error can't find the container with id 29aab898f20508f80d4d2a054022c6681d961fa8c105e22ed9dda26d488470d4 Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.036206 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-67093bd95d31be9f741ae242def6600b33a892f421a92cb7f577dea60bdc4d2d WatchSource:0}: Error finding container 67093bd95d31be9f741ae242def6600b33a892f421a92cb7f577dea60bdc4d2d: Status 404 returned error can't find the container with id 67093bd95d31be9f741ae242def6600b33a892f421a92cb7f577dea60bdc4d2d Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.044842 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.050863 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.067147 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-710b05010985c3f011866f6b6c769f14f635301c0bad0d72b0c95ea0d2e29704 WatchSource:0}: Error finding container 710b05010985c3f011866f6b6c769f14f635301c0bad0d72b0c95ea0d2e29704: Status 404 returned error can't find the container with id 710b05010985c3f011866f6b6c769f14f635301c0bad0d72b0c95ea0d2e29704 Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.072660 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c11e09360cf28cfadd8e9f1ffe64872837f4a621ab81aee1af664d1070318dcd WatchSource:0}: Error finding container c11e09360cf28cfadd8e9f1ffe64872837f4a621ab81aee1af664d1070318dcd: Status 404 returned error can't find the container with id c11e09360cf28cfadd8e9f1ffe64872837f4a621ab81aee1af664d1070318dcd Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.093554 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.298802 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.300300 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.300343 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.300356 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.300383 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.300847 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.487292 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.561923 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.562086 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c11e09360cf28cfadd8e9f1ffe64872837f4a621ab81aee1af664d1070318dcd"} Dec 06 
09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.564510 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4" exitCode=0 Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.564641 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.564670 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"710b05010985c3f011866f6b6c769f14f635301c0bad0d72b0c95ea0d2e29704"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.564839 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.566155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.566218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.566229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.566745 4672 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5530982f90665e9b25a26d353a18fb472e31cc59738eb41a32ad00d74144ea8e" exitCode=0 Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.566840 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5530982f90665e9b25a26d353a18fb472e31cc59738eb41a32ad00d74144ea8e"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.566873 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"67093bd95d31be9f741ae242def6600b33a892f421a92cb7f577dea60bdc4d2d"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.567007 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.568276 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.568331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.568341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.569236 4672 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f59bf33a46328dcf822bfb8d8f5d090434302c33cc15139a7b2b5c4077ceabf8" exitCode=0 Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.569295 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f59bf33a46328dcf822bfb8d8f5d090434302c33cc15139a7b2b5c4077ceabf8"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.569319 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"29aab898f20508f80d4d2a054022c6681d961fa8c105e22ed9dda26d488470d4"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.570422 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571321 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571333 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571808 4672 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21" exitCode=0 Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571836 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571855 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"03f4762aad1aafa36a10d8c8458308575f6acca3235f897787b6e9b2fdef2829"} Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.571922 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.572677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.572711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:23 crc kubenswrapper[4672]: I1206 09:06:23.572723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.684368 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.684460 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.741988 
4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.742622 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.796410 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.796500 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Dec 06 09:06:23 crc kubenswrapper[4672]: W1206 09:06:23.809342 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.809455 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Dec 06 09:06:23 crc kubenswrapper[4672]: E1206 09:06:23.895529 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.101816 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.103455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.103492 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.103503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.103527 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 09:06:24 crc kubenswrapper[4672]: E1206 09:06:24.103995 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Dec 06 09:06:24 crc 
kubenswrapper[4672]: I1206 09:06:24.577161 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.577210 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.577221 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.577306 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.578185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.578217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.578229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.581311 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.581769 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.581795 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.581832 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.582259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.582316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.582330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.592235 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.592281 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.592296 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.592307 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.592319 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.592431 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.593376 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.593405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.593417 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.595853 4672 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="774bda10fba267ae24351b1bc455448c8190a85c4c15a21fe87db2dd447a2ed6" exitCode=0 Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.595931 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.596323 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"774bda10fba267ae24351b1bc455448c8190a85c4c15a21fe87db2dd447a2ed6"} Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.596417 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.597085 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.597113 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.597126 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.597832 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.597854 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:24 crc kubenswrapper[4672]: I1206 09:06:24.597867 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.006783 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.601435 4672 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4ce5ebaf7b3cc61250e2c3a5fbd849d57856bfc2fa35385c80de2d6f36be38da" exitCode=0 Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.601566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4ce5ebaf7b3cc61250e2c3a5fbd849d57856bfc2fa35385c80de2d6f36be38da"} Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.601835 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.603400 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.603465 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.603490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.606003 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"98d7126f50f2ad9884cac70711f8ed19b5144c280513b399b334b8249048d098"} Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.606079 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.606138 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.606171 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.606202 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607528 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607627 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607684 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:25 crc 
kubenswrapper[4672]: I1206 09:06:25.607714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.607993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.608008 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.704210 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.705273 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.705306 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.705314 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.705337 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 09:06:25 crc kubenswrapper[4672]: I1206 09:06:25.841976 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.612951 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.613381 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98bd718812bc45a4176b235104cb76050c376bd49d49286ec7ab92c22ad0e942"} Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.613408 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"beb9c1dec645e8b6fe67ddb24082cac239f1414e9c3baba93edb47c7114b31b7"} Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.613418 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6141b7a5b5294f3de9083378856c4c50ea6efb6ff7f697573bd933589a72de3"} Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.613426 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f32a1a16672e107ef7350e875185d5421b1a5e1e6acc3e524cad34b3001478dc"} Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.613434 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55319c08434190bee96f4cd0b8eb34de18d8f1796f21ea2459fdabcf22dba050"} Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 
09:06:26.613498 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.614296 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.614339 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.614355 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.614306 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.614485 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:26 crc kubenswrapper[4672]: I1206 09:06:26.614504 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.130837 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.131041 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.132502 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.132553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.132567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.374319 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.615676 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.617219 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.617282 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:27 crc kubenswrapper[4672]: I1206 09:06:27.617307 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:28 crc kubenswrapper[4672]: I1206 09:06:28.624024 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 06 09:06:28 crc kubenswrapper[4672]: I1206 09:06:28.624302 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:28 crc kubenswrapper[4672]: I1206 09:06:28.625539 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:28 crc kubenswrapper[4672]: I1206 09:06:28.625582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:28 
crc kubenswrapper[4672]: I1206 09:06:28.625621 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:29 crc kubenswrapper[4672]: I1206 09:06:29.480446 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:29 crc kubenswrapper[4672]: I1206 09:06:29.480829 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:29 crc kubenswrapper[4672]: I1206 09:06:29.482629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:29 crc kubenswrapper[4672]: I1206 09:06:29.482695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:29 crc kubenswrapper[4672]: I1206 09:06:29.482710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.226087 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.226327 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.227999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.228160 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.228271 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:32 crc kubenswrapper[4672]: E1206 09:06:32.600995 4672 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.686652 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.687232 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.688892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.688935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:32 crc kubenswrapper[4672]: I1206 09:06:32.688949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.143864 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.144010 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.279957 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.280312 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.282336 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.282409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.282424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.474543 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.480843 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.488257 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.636537 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.638164 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.638226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.638251 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:34 crc kubenswrapper[4672]: I1206 09:06:34.644899 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.007455 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.007585 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 09:06:35 crc kubenswrapper[4672]: W1206 09:06:35.369393 4672 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.369530 4672 trace.go:236] Trace[1615295963]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 09:06:25.367) (total time: 10002ms): Dec 06 09:06:35 crc kubenswrapper[4672]: Trace[1615295963]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:06:35.369) Dec 06 09:06:35 crc kubenswrapper[4672]: Trace[1615295963]: [10.002092313s] [10.002092313s] END Dec 06 09:06:35 crc kubenswrapper[4672]: E1206 09:06:35.369584 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 06 09:06:35 crc kubenswrapper[4672]: E1206 09:06:35.497124 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.591965 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.592092 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.638983 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.639809 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.639845 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:35 crc kubenswrapper[4672]: I1206 09:06:35.639858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:36 crc kubenswrapper[4672]: I1206 09:06:36.640709 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:36 crc kubenswrapper[4672]: I1206 09:06:36.641659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:36 crc kubenswrapper[4672]: I1206 09:06:36.641701 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:36 crc kubenswrapper[4672]: I1206 09:06:36.641711 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:37 crc kubenswrapper[4672]: I1206 09:06:37.281008 4672 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 09:06:37 crc kubenswrapper[4672]: I1206 09:06:37.281131 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 09:06:38 crc kubenswrapper[4672]: I1206 09:06:38.666361 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 06 09:06:38 crc kubenswrapper[4672]: I1206 09:06:38.666483 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:38 crc kubenswrapper[4672]: I1206 09:06:38.667456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:38 crc kubenswrapper[4672]: I1206 09:06:38.667499 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:38 crc kubenswrapper[4672]: I1206 09:06:38.667514 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:38 crc kubenswrapper[4672]: I1206 09:06:38.686390 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 06 09:06:39 crc kubenswrapper[4672]: I1206 09:06:39.646512 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:39 crc kubenswrapper[4672]: I1206 09:06:39.647443 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:39 crc kubenswrapper[4672]: I1206 09:06:39.647482 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:39 crc kubenswrapper[4672]: I1206 09:06:39.647494 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.012314 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.013304 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.014502 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.014551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.014582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:40 
crc kubenswrapper[4672]: I1206 09:06:40.016790 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.614844 4672 trace.go:236] Trace[1503950650]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 09:06:26.863) (total time: 13751ms): Dec 06 09:06:40 crc kubenswrapper[4672]: Trace[1503950650]: ---"Objects listed" error: 13751ms (09:06:40.614) Dec 06 09:06:40 crc kubenswrapper[4672]: Trace[1503950650]: [13.751303453s] [13.751303453s] END Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.614872 4672 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.615639 4672 trace.go:236] Trace[1714819300]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 09:06:26.813) (total time: 13801ms): Dec 06 09:06:40 crc kubenswrapper[4672]: Trace[1714819300]: ---"Objects listed" error: 13801ms (09:06:40.615) Dec 06 09:06:40 crc kubenswrapper[4672]: Trace[1714819300]: [13.801872596s] [13.801872596s] END Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.615677 4672 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.621413 4672 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 06 09:06:40 crc kubenswrapper[4672]: E1206 09:06:40.621487 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.622344 4672 trace.go:236] Trace[157518909]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 09:06:25.731) (total time: 14890ms): Dec 06 09:06:40 crc kubenswrapper[4672]: Trace[157518909]: ---"Objects listed" error: 14890ms (09:06:40.622) Dec 06 09:06:40 crc kubenswrapper[4672]: Trace[157518909]: [14.890457991s] [14.890457991s] END Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.622363 4672 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.649016 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.649071 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.650394 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.650426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.650435 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.669261 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52826->192.168.126.11:17697: read: connection reset by peer" 
start-of-body= Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.669326 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52826->192.168.126.11:17697: read: connection reset by peer" Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.669962 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 06 09:06:40 crc kubenswrapper[4672]: I1206 09:06:40.670064 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.652898 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.655154 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2" exitCode=255 Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.655233 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2"} Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.655544 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.656490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.656530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.656538 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.657070 4672 scope.go:117] "RemoveContainer" containerID="d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2" Dec 06 09:06:41 crc kubenswrapper[4672]: I1206 09:06:41.734909 4672 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.486838 4672 apiserver.go:52] "Watching apiserver" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.489776 4672 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.490161 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-xbbs5","openshift-multus/multus-additional-cni-plugins-fdr5p","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-dns/node-resolver-dl2fd","openshift-machine-config-operator/machine-config-daemon-4s7nh","openshift-multus/multus-ks2jd"] Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.490616 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.490681 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.490688 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.490733 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.490827 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.490901 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.491093 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.491572 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.491673 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.491708 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.491708 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dl2fd"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.491939 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fdr5p"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.492034 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.492124 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.492376 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ks2jd"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500430 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500522 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500645 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500694 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500857 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500897 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.500955 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.501357 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.501440 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.501360 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.501735 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.502874 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.502929 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.503255 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
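Each "Caches populated" line is a client-go reflector finishing its initial list-and-watch for one ConfigMap or Secret that an incoming pod mounts; pod startup waits on these syncs. A runnable sketch of the same machinery, assuming $KUBECONFIG access (the 30-second resync period is arbitrary):

```go
package main

import (
	"fmt"
	"os"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Same factory the log references (k8s.io/client-go/informers/factory.go).
	factory := informers.NewSharedInformerFactory(cs, 30*time.Second)
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	// A "Caches populated" log line corresponds to this sync completing.
	if !cache.WaitForCacheSync(stop, cmInformer.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("ConfigMap cache populated")
}
```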
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.503781 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.510547 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.511978 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512078 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512121 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512187 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512301 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512092 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512464 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512491 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512658 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512658 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.512766 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.514024 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.519979 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.520008 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.535891 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.554941 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.567558 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.578715 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.588667 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.589901 4672 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.602015 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.627640 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.636998 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.636998 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637041 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637070 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637090 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637113 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637133 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637148 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637178 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637194 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637210 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637228 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 06 09:06:42
crc kubenswrapper[4672]: I1206 09:06:42.637245 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637260 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637277 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637294 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637311 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637322 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637329 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637381 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637403 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637426 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637451 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637475 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637498 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637521 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637540 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637554 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637569 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637584 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637640 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637656 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637672 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637688 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637703 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637719 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637736 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637755 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637773 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637789 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637805 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637820 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637838 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637855 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637872 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637887 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637905 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637922 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637939 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637957 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637972 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637987 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638015 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638052 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638075 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638094 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638114 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638137 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638155 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638173 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638192 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638212 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638231 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638250 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638268 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638287 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638306 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638325 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638344 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638365 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638387 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638414 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638437 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638463 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638484 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638507 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638526 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638542 
4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638558 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638574 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638591 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638626 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638644 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638661 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638677 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638693 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638711 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638728 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638744 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638763 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638780 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638797 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638835 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638851 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638868 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638885 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638899 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc 
kubenswrapper[4672]: I1206 09:06:42.638916 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638932 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638947 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638964 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639004 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639023 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639039 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639058 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639076 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639092 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639108 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639124 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639139 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639157 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639175 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639194 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639217 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639238 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639259 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639280 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639299 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639317 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639338 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639359 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639383 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639432 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639448 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639464 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639481 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639498 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639515 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639529 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639545 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639560 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639576 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639591 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639622 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639641 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639657 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639675 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639691 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639707 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639728 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639743 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639758 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639774 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639791 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639809 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639824 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639841 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" 
(UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639859 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639875 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639892 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639908 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639925 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639942 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639959 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639976 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639991 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 
09:06:42.640007 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640023 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640039 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640055 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640072 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640089 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640106 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640123 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640139 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640157 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 
09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640173 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640190 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640206 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640225 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640242 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640260 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640277 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640293 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640309 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640324 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " 
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640341 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640358 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640375 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640391 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640408 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640423 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640441 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640457 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640473 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640491 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640507 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640526 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640541 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640558 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640575 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640591 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640672 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640690 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640773 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640804 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640823 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640877 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-cni-multus\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640897 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-kubelet\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640916 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-hostroot\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640933 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-k8s-cni-cncf-io\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640947 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-node-log\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640961 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-bin\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640976 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-netns\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.640990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-cni-bin\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641006 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641021 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-system-cni-dir\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641035 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-systemd\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641050 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-etc-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641071 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641093 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkjbw\" (UniqueName: \"kubernetes.io/projected/4f3843b7-3dcd-451e-a394-73bc3f037c9d-kube-api-access-pkjbw\") pod \"node-resolver-dl2fd\" (UID: \"4f3843b7-3dcd-451e-a394-73bc3f037c9d\") " pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641107 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-cni-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641127 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641144 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641164 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-kubelet\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641178 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-ovn\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641193 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-rootfs\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641208 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-system-cni-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641225 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641242 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-netns\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641258 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-cnibin\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641273 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-netd\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 
09:06:42.641288 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-cnibin\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641303 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-slash\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641343 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmp5r\" (UniqueName: \"kubernetes.io/projected/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-kube-api-access-zmp5r\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641364 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641381 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5rc\" (UniqueName: \"kubernetes.io/projected/4471a809-0ca4-44fd-aa93-3d89e87a2291-kube-api-access-wr5rc\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641400 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641418 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641434 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-os-release\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641453 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-log-socket\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641468 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-os-release\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641485 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-conf-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641502 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thfv\" (UniqueName: \"kubernetes.io/projected/25b493f7-0dae-4eb4-9499-0564410528f7-kube-api-access-5thfv\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641519 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25b493f7-0dae-4eb4-9499-0564410528f7-multus-daemon-config\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641536 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641555 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641573 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641591 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641637 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovn-node-metrics-cert\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641657 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641675 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-proxy-tls\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f3843b7-3dcd-451e-a394-73bc3f037c9d-hosts-file\") pod \"node-resolver-dl2fd\" (UID: \"4f3843b7-3dcd-451e-a394-73bc3f037c9d\") " pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641709 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4471a809-0ca4-44fd-aa93-3d89e87a2291-cni-binary-copy\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641728 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641744 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-var-lib-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641763 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-socket-dir-parent\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641795 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-multus-certs\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641810 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-etc-kubernetes\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641826 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641842 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4471a809-0ca4-44fd-aa93-3d89e87a2291-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641857 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-env-overrides\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641873 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-mcd-auth-proxy-config\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-systemd-units\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641906 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-config\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641921 
4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-script-lib\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641937 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgnn\" (UniqueName: \"kubernetes.io/projected/713432b9-3b28-4ad0-b578-9d42aa1931aa-kube-api-access-blgnn\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641957 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641973 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641989 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25b493f7-0dae-4eb4-9499-0564410528f7-cni-binary-copy\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642027 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637475 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637561 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637557 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637652 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637827 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642099 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.637839 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638117 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638183 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638325 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638378 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638715 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638767 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.638880 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639051 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639432 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639570 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.639720 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641590 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.641641 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642048 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642256 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642260 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642360 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642471 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642539 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642866 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.642906 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643036 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643061 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643187 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643242 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643283 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643424 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643582 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643610 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643744 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643759 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643786 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643946 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643945 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.643970 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644082 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644102 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644184 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644262 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644332 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644386 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.644462 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644480 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.644527 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:43.144499067 +0000 UTC m=+20.888759344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644687 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644715 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644900 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644905 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.644946 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645108 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645108 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645162 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645181 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645445 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645479 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645511 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645709 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645737 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.645896 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646105 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646115 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646175 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646335 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646397 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646461 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646557 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646803 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646841 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.646957 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.647005 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.647239 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.647359 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.647385 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.647651 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.647915 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.648611 4672 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.648858 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.650619 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.650628 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.650800 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.650954 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651090 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651122 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651144 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651498 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651547 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651833 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.651875 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.652004 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.652502 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.652710 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.652725 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.653087 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.653293 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.653783 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.654959 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.658486 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:43.158466864 +0000 UTC m=+20.902727151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.655010 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.655144 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.655257 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.655401 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.655872 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.656103 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.656166 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.656226 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.656491 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.657858 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.658345 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.658390 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.654487 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.655879 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.658815 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.661009 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.661407 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.661763 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.661920 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.661989 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.662420 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.662762 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.663987 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.664488 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.664678 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.664754 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.664772 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.665011 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.666491 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.666607 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.667203 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.667897 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.671524 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.672923 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.673247 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.673305 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.674282 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.674843 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675164 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675286 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675338 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675504 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675721 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675759 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.675989 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.676003 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.676377 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.676942 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.677288 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.678315 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.679085 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.679614 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.679845 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.679958 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680039 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680234 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680335 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680427 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680623 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680790 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.680957 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.681578 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.681738 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.681820 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.682005 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.682057 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.682072 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.682007 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.682438 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.683048 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.685337 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685417 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685461 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685476 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685583 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:43.185560956 +0000 UTC m=+20.929821423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685777 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685794 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685804 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.685843 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:43.185832543 +0000 UTC m=+20.930093050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.686189 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: E1206 09:06:42.687257 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:06:43.187237189 +0000 UTC m=+20.931497476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.687368 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.687524 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.687891 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.687982 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.688246 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.688750 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.689326 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.689467 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.689790 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.691285 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.691574 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.691641 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.692218 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.692298 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.692383 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.692509 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.692780 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.693215 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.693279 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.693923 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.694399 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.694425 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.694771 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.694766 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.695066 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.695061 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.697242 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e"}
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.697941 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.698024 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.699468 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.702662 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.704377 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.705713 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.710288 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.710293 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.720856 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.722853 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.730936 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.733395 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742305 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742477 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-k8s-cni-cncf-io\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-k8s-cni-cncf-io\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742714 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-cni-multus\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742777 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-cni-multus\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " 
pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742795 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-kubelet\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742818 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-kubelet\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742823 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-hostroot\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742855 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742862 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-hostroot\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742874 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-node-log\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742894 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-bin\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742926 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-netns\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.743418 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-bin\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.743554 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-cni-bin\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.742944 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-node-log\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.743003 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-netns\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.744360 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747019 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-var-lib-cni-bin\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747078 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkjbw\" (UniqueName: \"kubernetes.io/projected/4f3843b7-3dcd-451e-a394-73bc3f037c9d-kube-api-access-pkjbw\") pod \"node-resolver-dl2fd\" (UID: \"4f3843b7-3dcd-451e-a394-73bc3f037c9d\") " pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747105 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-cni-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747137 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-system-cni-dir\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747194 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-systemd\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747213 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-etc-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: 
\"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747245 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-system-cni-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747266 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-kubelet\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747288 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-ovn\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747308 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-rootfs\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747328 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-netns\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747347 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-cnibin\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747365 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-cnibin\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747384 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-netd\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-slash\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747422 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmp5r\" (UniqueName: \"kubernetes.io/projected/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-kube-api-access-zmp5r\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747454 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5rc\" (UniqueName: \"kubernetes.io/projected/4471a809-0ca4-44fd-aa93-3d89e87a2291-kube-api-access-wr5rc\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747480 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5thfv\" (UniqueName: \"kubernetes.io/projected/25b493f7-0dae-4eb4-9499-0564410528f7-kube-api-access-5thfv\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747507 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-os-release\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747591 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-log-socket\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.747530 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-log-socket\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.748934 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-system-cni-dir\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.748970 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-systemd\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.748999 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-etc-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749051 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-system-cni-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749077 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-kubelet\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749131 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-os-release\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.750533 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-conf-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.750673 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.750763 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25b493f7-0dae-4eb4-9499-0564410528f7-multus-daemon-config\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.750848 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-proxy-tls\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.750928 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f3843b7-3dcd-451e-a394-73bc3f037c9d-hosts-file\") pod \"node-resolver-dl2fd\" (UID: \"4f3843b7-3dcd-451e-a394-73bc3f037c9d\") " pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751000 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751171 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovn-node-metrics-cert\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751279 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4471a809-0ca4-44fd-aa93-3d89e87a2291-cni-binary-copy\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751369 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-var-lib-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751446 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751517 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751582 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-socket-dir-parent\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751680 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-multus-certs\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751748 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751756 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-etc-kubernetes\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751955 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4471a809-0ca4-44fd-aa93-3d89e87a2291-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752012 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749182 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-os-release\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749146 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-cni-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749201 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-netd\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749515 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-os-release\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749279 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-netns\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749309 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-cnibin\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752025 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-env-overrides\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752328 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-mcd-auth-proxy-config\") pod \"machine-config-daemon-4s7nh\" (UID: 
\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752385 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-var-lib-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752400 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752325 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-host-run-multus-certs\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752355 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749260 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-rootfs\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752490 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-socket-dir-parent\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751609 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f3843b7-3dcd-451e-a394-73bc3f037c9d-hosts-file\") pod \"node-resolver-dl2fd\" (UID: \"4f3843b7-3dcd-451e-a394-73bc3f037c9d\") " pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-etc-kubernetes\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749222 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-ovn\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 
09:06:42.752765 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-systemd-units\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752838 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-config\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752903 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-script-lib\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.752973 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blgnn\" (UniqueName: \"kubernetes.io/projected/713432b9-3b28-4ad0-b578-9d42aa1931aa-kube-api-access-blgnn\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753057 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753134 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25b493f7-0dae-4eb4-9499-0564410528f7-cni-binary-copy\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753324 4672 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753390 4672 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753845 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753929 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754003 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754063 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754123 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754183 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754249 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754305 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751691 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25b493f7-0dae-4eb4-9499-0564410528f7-multus-daemon-config\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-env-overrides\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754369 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754427 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754434 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-systemd-units\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754467 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-openvswitch\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc 
kubenswrapper[4672]: I1206 09:06:42.754444 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754547 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754549 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-mcd-auth-proxy-config\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754591 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754637 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754682 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754699 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754712 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754726 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754789 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.754804 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749345 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4471a809-0ca4-44fd-aa93-3d89e87a2291-cnibin\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.755231 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-script-lib\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.751725 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25b493f7-0dae-4eb4-9499-0564410528f7-multus-conf-dir\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.753973 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4471a809-0ca4-44fd-aa93-3d89e87a2291-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.749326 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-slash\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.760586 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25b493f7-0dae-4eb4-9499-0564410528f7-cni-binary-copy\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.761327 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4471a809-0ca4-44fd-aa93-3d89e87a2291-cni-binary-copy\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.762002 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovn-node-metrics-cert\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.764117 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-config\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.767779 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768530 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768558 4672 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768572 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768586 4672 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768622 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768641 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768656 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768667 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768680 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768692 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768704 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768718 4672 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768730 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768743 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768755 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768769 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768781 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768793 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768804 4672 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768816 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768828 4672 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768839 4672 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768851 4672 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768863 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768875 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768888 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768899 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768911 4672 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768922 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768936 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768948 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768962 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768978 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.768992 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769007 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769020 4672 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769037 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769050 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769065 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769077 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769089 4672 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769101 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769123 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769137 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769150 4672 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769162 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769174 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769184 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769197 4672 reconciler_common.go:293] 
"Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769221 4672 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769233 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769245 4672 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769257 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769268 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769282 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769294 4672 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769306 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769322 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769334 4672 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769348 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769360 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769372 4672 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769383 4672 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769395 4672 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769407 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769439 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769452 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769464 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769477 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769489 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769499 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769522 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769534 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769546 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769558 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769569 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769580 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769592 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769620 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769633 4672 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769646 4672 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769657 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769669 4672 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769681 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769692 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769705 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769716 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769727 4672 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769736 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769744 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769753 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769761 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769772 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769785 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769798 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769811 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769821 4672 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769833 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769846 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769857 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769868 4672 reconciler_common.go:293] "Volume detached for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769880 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769892 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769903 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769912 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769921 4672 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769930 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769940 4672 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769953 4672 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769962 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769973 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769983 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.769992 4672 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770002 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770011 4672 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770019 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770028 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770039 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770048 4672 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770058 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770071 4672 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770109 4672 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770119 4672 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770129 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770138 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770147 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770157 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node 
\"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770166 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770176 4672 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770184 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770194 4672 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770203 4672 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770212 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770221 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770176 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-proxy-tls\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770230 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770306 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770319 4672 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770084 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5rc\" (UniqueName: \"kubernetes.io/projected/4471a809-0ca4-44fd-aa93-3d89e87a2291-kube-api-access-wr5rc\") pod \"multus-additional-cni-plugins-fdr5p\" (UID: \"4471a809-0ca4-44fd-aa93-3d89e87a2291\") " pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770331 
4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770653 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770663 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770691 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770701 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770710 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770718 4672 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770728 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770736 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770745 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770782 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770792 4672 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770801 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 
09:06:42.770811 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770820 4672 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770829 4672 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770854 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770864 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770873 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770881 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770891 4672 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770900 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770909 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770934 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thfv\" (UniqueName: \"kubernetes.io/projected/25b493f7-0dae-4eb4-9499-0564410528f7-kube-api-access-5thfv\") pod \"multus-ks2jd\" (UID: \"25b493f7-0dae-4eb4-9499-0564410528f7\") " pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770938 4672 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770968 4672 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770978 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770987 4672 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.770997 4672 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.771005 4672 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.771015 4672 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.771024 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.771033 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.771043 4672 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.771052 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.772922 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkjbw\" (UniqueName: \"kubernetes.io/projected/4f3843b7-3dcd-451e-a394-73bc3f037c9d-kube-api-access-pkjbw\") pod \"node-resolver-dl2fd\" (UID: \"4f3843b7-3dcd-451e-a394-73bc3f037c9d\") " pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.773489 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmp5r\" (UniqueName: \"kubernetes.io/projected/b0e78155-0eda-42cd-b11b-fbd9e5cc1e39-kube-api-access-zmp5r\") pod \"machine-config-daemon-4s7nh\" (UID: \"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\") " pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.776335 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgnn\" 
(UniqueName: \"kubernetes.io/projected/713432b9-3b28-4ad0-b578-9d42aa1931aa-kube-api-access-blgnn\") pod \"ovnkube-node-xbbs5\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.782002 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.791918 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.800896 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.806667 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.807754 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.815385 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 09:06:42 crc kubenswrapper[4672]: W1206 09:06:42.818954 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e3a06daa17aa8f392bf80b347754e01dd876ffbd0efcb5a056c87857abec7c33 WatchSource:0}: Error finding container e3a06daa17aa8f392bf80b347754e01dd876ffbd0efcb5a056c87857abec7c33: Status 404 returned error can't find the container with id e3a06daa17aa8f392bf80b347754e01dd876ffbd0efcb5a056c87857abec7c33 Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.822003 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.822317 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.827935 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dl2fd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.835040 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.835643 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: W1206 09:06:42.837633 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-565dac3b3259eb45b71e92c343a49f441bff2b1e0645a279dd3c09fc7945c39b WatchSource:0}: Error finding container 565dac3b3259eb45b71e92c343a49f441bff2b1e0645a279dd3c09fc7945c39b: Status 404 returned error can't find the container with id 565dac3b3259eb45b71e92c343a49f441bff2b1e0645a279dd3c09fc7945c39b Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.842413 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.848068 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.852969 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.853544 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ks2jd" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.867086 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.877192 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: W1206 09:06:42.879955 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4471a809_0ca4_44fd_aa93_3d89e87a2291.slice/crio-577f8cba9151f1023e4393849f8397b4960739ddbf662cc60ed3adc948684b99 WatchSource:0}: Error finding container 577f8cba9151f1023e4393849f8397b4960739ddbf662cc60ed3adc948684b99: Status 404 returned error can't find the container with id 577f8cba9151f1023e4393849f8397b4960739ddbf662cc60ed3adc948684b99 Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.889347 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.902120 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.922547 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.961720 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.976637 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4672]: I1206 09:06:42.995782 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.008660 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 
06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.030014 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\"
:{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.045443 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.056335 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.067941 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.084405 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.185131 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.185292 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.185465 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.185711 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:44.185697027 +0000 UTC m=+21.929957314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.186615 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.186774 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:44.186740945 +0000 UTC m=+21.931001392 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.285784 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.285929 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:06:44.2858903 +0000 UTC m=+22.030150577 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.286393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.286429 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286591 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286632 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286647 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286658 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:43 
crc kubenswrapper[4672]: E1206 09:06:43.286694 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286707 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:44.286688661 +0000 UTC m=+22.030948948 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286708 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.286787 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:44.286751733 +0000 UTC m=+22.031012020 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.703668 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda" exitCode=0 Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.703738 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.703767 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"5d6a89e307227cafbb58809edb9c2c25d1c8d42087540f3466ab30c439922c71"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.706928 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"565dac3b3259eb45b71e92c343a49f441bff2b1e0645a279dd3c09fc7945c39b"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.709027 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.709098 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e3a06daa17aa8f392bf80b347754e01dd876ffbd0efcb5a056c87857abec7c33"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.710401 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerStarted","Data":"3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.710446 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerStarted","Data":"3ae7fc8466478f4f3a20a7095911104d22e13b4176e877f0b9aa7b46db2bdd2c"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.716473 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.716524 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.716538 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"6a0fcd22458500c202cdf8d2c3753426276ab70995b45015d656d6b7dc87ae72"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.718036 4672 generic.go:334] "Generic (PLEG): container finished" podID="4471a809-0ca4-44fd-aa93-3d89e87a2291" containerID="c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e" exitCode=0 Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.718085 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerDied","Data":"c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.718102 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerStarted","Data":"577f8cba9151f1023e4393849f8397b4960739ddbf662cc60ed3adc948684b99"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.718590 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.720442 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl2fd" event={"ID":"4f3843b7-3dcd-451e-a394-73bc3f037c9d","Type":"ContainerStarted","Data":"65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.720469 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl2fd" event={"ID":"4f3843b7-3dcd-451e-a394-73bc3f037c9d","Type":"ContainerStarted","Data":"932a99752cc82d00e60e7193bcb50ebe655d70b49af5f32af4888ab52a008104"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.732235 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.732299 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.732310 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6604951e90c653acc11c5a4affa260182c61137ec56bdf6905ce35911ec89c04"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.739267 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.755705 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.777915 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.792792 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.804422 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.818478 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.822021 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.829633 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.829705 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.829717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.830003 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.833353 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.838749 4672 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.838896 4672 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.843068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.843120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.843132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.843149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.843161 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:43Z","lastTransitionTime":"2025-12-06T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.848173 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.866419 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.870130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.870154 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.870163 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.870177 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.870188 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:43Z","lastTransitionTime":"2025-12-06T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.882738 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.884370 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.886836 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.886870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.886877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.886891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.886911 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:43Z","lastTransitionTime":"2025-12-06T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.906926 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.929140 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.933002 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.955101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.955137 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.955148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.955163 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.955173 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:43Z","lastTransitionTime":"2025-12-06T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.972360 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: E1206 09:06:43.988268 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.994703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.994729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.994736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.994748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.994756 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:43Z","lastTransitionTime":"2025-12-06T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:43 crc kubenswrapper[4672]: I1206 09:06:43.997095 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:43Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.019332 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.019450 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.020734 4672 status_manager.go:875] "Failed to update status for pod" 
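
The two records above fix the shape of the failure for the rest of this excerpt: every node-status patch is rejected because the TLS handshake with the node.network-node-identity.openshift.io webhook fails (its serving certificate expired 2025-08-24T17:21:41Z, and the node clock reads 2025-12-06), and after a fixed number of attempts the kubelet gives up with "update node status exceeds retry count". A minimal sketch of that bounded-retry pattern follows; `tryPatchNodeStatus` is a hypothetical stand-in for the per-attempt PATCH, and the retry count of 5 mirrors the kubelet's `nodeStatusUpdateRetry` constant (an assumption to verify against your kubelet version):

```go
// Bounded-retry sketch behind "update node status exceeds retry count".
// Illustrative only: tryPatchNodeStatus is a hypothetical stand-in for the
// kubelet's per-attempt node-status patch.
package main

import (
	"errors"
	"fmt"
)

// Assumed to match kubelet's nodeStatusUpdateRetry; verify for your version.
const nodeStatusUpdateRetry = 5

func tryPatchNodeStatus(attempt int) error {
	// Stand-in for the real PATCH; here it always fails the way the log
	// shows: the admission webhook's serving certificate is expired.
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchNodeStatus(i); err != nil {
			fmt.Printf("attempt %d: %v\n", i+1, err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("E:", err)
	}
}
```
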
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.021445 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.021549 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.021635 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.021699 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.021754 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.036337 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.050944 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.077068 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.099271 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc 
kubenswrapper[4672]: I1206 09:06:44.118813 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.124478 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.124523 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.124532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.124552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.124567 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.138375 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.157496 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.171634 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.189316 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.198756 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.198804 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.198902 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.198959 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:46.198944952 +0000 UTC m=+23.943205239 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.198998 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.199137 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:46.199106816 +0000 UTC m=+23.943367103 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.228338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.228377 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.228389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.228418 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.228431 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.286028 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.298241 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.300292 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.300456 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.300537 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.300734 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.300760 4672 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.300774 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.300827 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:46.300810328 +0000 UTC m=+24.045070615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.301212 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:06:46.301191388 +0000 UTC m=+24.045451675 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.301277 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.301288 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.301297 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.301326 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:46.301317032 +0000 UTC m=+24.045577319 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.305194 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.313506 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.334711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.334747 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.334761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.334782 4672 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.334794 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.338514 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.354906 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.373948 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.394792 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.417363 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z 
is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.433522 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.437243 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.437272 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.437282 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.437297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.437307 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.447078 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.464497 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d
1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.486378 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.500178 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.510890 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.528544 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.542241 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.542286 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.542302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.542322 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.542337 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.552865 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.557767 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.557923 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.558429 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.558479 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.558900 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.558961 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.561634 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.562385 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.563741 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.564440 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.565420 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.566053 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.566793 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.568933 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.569749 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 
09:06:44.570811 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.571376 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.572540 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.573247 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.573715 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.574197 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.576182 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.576723 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.577879 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.578288 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.578943 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.580003 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.580570 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.581584 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.582022 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.583945 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.584561 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.585817 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.586744 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.587339 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.588448 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.589055 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.590459 4672 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.590565 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.591145 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.592281 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.593359 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" 
path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.593843 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.595639 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.596288 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.597753 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.598524 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.599839 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.601805 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.603437 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.604097 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.605162 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.605745 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.606955 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.607739 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.609065 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.610825 4672 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.611782 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.616348 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.616883 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.618068 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.618829 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.619421 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.633318 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z 
is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.647053 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.647098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.647108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.647129 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.647139 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.649943 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.668159 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.678764 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.695972 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.740244 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.740306 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.740318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.740328 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.740339 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.740350 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.742699 4672 generic.go:334] "Generic (PLEG): container finished" podID="4471a809-0ca4-44fd-aa93-3d89e87a2291" containerID="ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0" exitCode=0 Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.742817 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerDied","Data":"ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.747530 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\
\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.749148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.749204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.749216 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.749236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.749249 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: E1206 09:06:44.753092 4672 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.801201 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.839092 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.853467 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.853508 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.853554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.853569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.853786 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.879108 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.920145 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.959023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.959089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.959101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.959059 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.959119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.959435 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:44Z","lastTransitionTime":"2025-12-06T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:44 crc kubenswrapper[4672]: I1206 09:06:44.999629 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:44Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.046957 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z 
is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.062262 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.062316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.062331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.062355 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.062368 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.079713 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.124435 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.157389 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.166123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.166229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.166378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.166495 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.166623 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.206145 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.243167 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.268953 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.269326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.269399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.269499 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.269665 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.279975 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.323386 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.361273 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.372138 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.372160 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.372168 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.372181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.372189 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.473964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.474004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.474014 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.474029 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.474040 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.576901 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.576954 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.576966 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.576989 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.577002 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.679000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.679340 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.679350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.679362 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.679372 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.747563 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.750028 4672 generic.go:334] "Generic (PLEG): container finished" podID="4471a809-0ca4-44fd-aa93-3d89e87a2291" containerID="153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890" exitCode=0 Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.750094 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerDied","Data":"153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.770826 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.781344 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.781380 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.781390 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.781403 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.781412 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.786751 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.801756 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.815167 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.829490 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.851218 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z 
is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.883821 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.883861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.883870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.883885 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.883895 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.884725 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.920380 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.941351 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.955705 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.966982 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.978664 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.985556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.985594 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.985613 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 
09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.985628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.985638 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:45Z","lastTransitionTime":"2025-12-06T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:45 crc kubenswrapper[4672]: I1206 09:06:45.993222 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-cont
roller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:45Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.004249 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.018520 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.032684 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.045070 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.076442 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.088894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.088941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.088953 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 
09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.088972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.088997 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.123982 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.161347 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.192200 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.192251 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.192268 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.192288 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.192305 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.205848 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.219078 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.219139 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.219216 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:46 crc 
kubenswrapper[4672]: E1206 09:06:46.219258 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.219304 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:50.219283499 +0000 UTC m=+27.963543806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.219329 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:50.21931833 +0000 UTC m=+27.963578627 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.243310 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.282354 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.295034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.295081 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.295091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.295110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.295122 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.319882 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.320077 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320223 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:06:50.32016681 +0000 UTC m=+28.064427107 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320367 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.320392 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320425 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320558 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320511 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320636 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320662 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320681 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:50.320652403 +0000 UTC m=+28.064912900 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.320758 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:50.320720654 +0000 UTC m=+28.064980981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.324568 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.359750 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.397161 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.397200 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.397210 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.397226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.397236 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.409658 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba20
82e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.499927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.499992 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.500008 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.500031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.500044 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.556406 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.556472 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.556554 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.556420 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.557014 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:46 crc kubenswrapper[4672]: E1206 09:06:46.557103 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.602792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.602831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.602843 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.602860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.602870 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.704490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.704539 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.704551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.704570 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.704627 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.757512 4672 generic.go:334] "Generic (PLEG): container finished" podID="4471a809-0ca4-44fd-aa93-3d89e87a2291" containerID="7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4" exitCode=0 Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.757586 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerDied","Data":"7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.780755 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.803031 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.808276 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.808310 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.808321 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.808337 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.808349 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.820731 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.836419 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.854668 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.871154 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.892922 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z 
is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.909022 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.910877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.910919 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.910931 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.910948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.910959 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:46Z","lastTransitionTime":"2025-12-06T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.924803 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.939482 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.953581 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.975927 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:46 crc kubenswrapper[4672]: I1206 09:06:46.992373 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:46Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.012972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.013003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.013015 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.013031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.013043 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.116140 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.116218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.116230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.116247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.116257 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.218962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.219008 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.219019 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.219041 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.219056 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.321453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.321493 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.321501 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.321518 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.321529 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.423788 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.424006 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.424067 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.424124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.424178 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.526847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.527087 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.527155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.527221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.527275 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.629002 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.629241 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.629315 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.629377 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.629430 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.731872 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.732109 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.732191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.732304 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.732389 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.762149 4672 generic.go:334] "Generic (PLEG): container finished" podID="4471a809-0ca4-44fd-aa93-3d89e87a2291" containerID="d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999" exitCode=0 Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.762345 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerDied","Data":"d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.772525 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.777594 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.790964 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.805423 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.831467 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.836814 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.836846 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.836857 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.836872 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.836884 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.858273 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba20
82e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.876099 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.888143 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.900845 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.918179 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.937230 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.940545 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.940593 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.940624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.940640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.940653 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:47Z","lastTransitionTime":"2025-12-06T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.952526 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.966231 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:47 crc kubenswrapper[4672]: I1206 09:06:47.982103 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:47Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.043044 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.043083 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.043092 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.043107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.043117 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.145264 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.145297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.145309 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.145324 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.145333 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.249067 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.249204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.249226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.249255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.249269 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.351454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.351496 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.351508 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.351525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.351536 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.455869 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.455927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.455943 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.455985 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.456001 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.556869 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.556889 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.556900 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:48 crc kubenswrapper[4672]: E1206 09:06:48.557006 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:48 crc kubenswrapper[4672]: E1206 09:06:48.557357 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:48 crc kubenswrapper[4672]: E1206 09:06:48.557412 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.559244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.559363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.559453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.559548 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.559642 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.564977 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sxrkj"] Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.565773 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.569161 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.570077 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.570566 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.571543 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.590967 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.604628 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.616419 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.632618 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.646169 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.662901 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.662960 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.662977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.663003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.663021 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.663893 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.676317 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.692189 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.703023 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.719055 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.735770 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.743225 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vls65\" (UniqueName: \"kubernetes.io/projected/37625968-279a-4fc1-bfa2-b03868e7363d-kube-api-access-vls65\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.743284 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37625968-279a-4fc1-bfa2-b03868e7363d-host\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.743332 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37625968-279a-4fc1-bfa2-b03868e7363d-serviceca\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.752320 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.765044 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.765082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.765092 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.765107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.765117 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.767453 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.779674 4672 generic.go:334] "Generic (PLEG): container finished" podID="4471a809-0ca4-44fd-aa93-3d89e87a2291" containerID="c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782" exitCode=0 Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.779747 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerDied","Data":"c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.789796 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z 
is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.820191 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a
74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.836203 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.845110 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37625968-279a-4fc1-bfa2-b03868e7363d-serviceca\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.845186 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vls65\" (UniqueName: \"kubernetes.io/projected/37625968-279a-4fc1-bfa2-b03868e7363d-kube-api-access-vls65\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.845242 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37625968-279a-4fc1-bfa2-b03868e7363d-host\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.845343 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37625968-279a-4fc1-bfa2-b03868e7363d-host\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.847052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37625968-279a-4fc1-bfa2-b03868e7363d-serviceca\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.851869 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.865541 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.867470 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.867496 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.867505 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.867520 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.867531 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.871424 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vls65\" (UniqueName: \"kubernetes.io/projected/37625968-279a-4fc1-bfa2-b03868e7363d-kube-api-access-vls65\") pod \"node-ca-sxrkj\" (UID: \"37625968-279a-4fc1-bfa2-b03868e7363d\") " pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.879283 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sxrkj" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.882927 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.899488 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.917329 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.930956 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.947440 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.967970 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.974328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.974410 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.974439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.974461 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.974474 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:48Z","lastTransitionTime":"2025-12-06T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:48 crc kubenswrapper[4672]: I1206 09:06:48.994128 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:48Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.007882 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.024588 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.036124 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.076657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.076722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.076733 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.076755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.076771 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.179622 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.179666 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.179684 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.179703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.179717 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.282517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.282558 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.282570 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.282587 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.282616 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.385547 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.385587 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.385636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.385655 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.385666 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.487537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.487575 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.487584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.487612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.487622 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.590385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.590448 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.590464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.590495 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.590516 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.693408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.693456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.693498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.693521 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.693535 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.788263 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.788649 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.788670 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.791890 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxrkj" event={"ID":"37625968-279a-4fc1-bfa2-b03868e7363d","Type":"ContainerStarted","Data":"ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.791956 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxrkj" event={"ID":"37625968-279a-4fc1-bfa2-b03868e7363d","Type":"ContainerStarted","Data":"a4b9a4fb3c680f02cd74124684cce88a06c902231c272fe741ecd55e24d3bee6"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.795798 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.795835 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.795849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.795866 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.795880 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.797476 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" event={"ID":"4471a809-0ca4-44fd-aa93-3d89e87a2291","Type":"ContainerStarted","Data":"640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.814493 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.819975 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.820064 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.831492 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.854348 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.872550 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.895530 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.898948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.898988 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.898997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.899012 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.899022 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:49Z","lastTransitionTime":"2025-12-06T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.918033 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.932038 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.955622 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.975727 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:49 crc kubenswrapper[4672]: I1206 09:06:49.989995 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:49Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.001939 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.001999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.002013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.002033 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.002047 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.006667 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.023694 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.042184 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27
d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.056828 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.074655 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.089489 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.105215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.105259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.105269 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.105285 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.105294 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.108355 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.124870 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.137717 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.153011 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.165955 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.187150 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f43
4004693faabb0b09140a10a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.201986 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.206993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.207033 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.207044 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.207059 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.207068 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.214996 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.230558 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.243675 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.258506 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.262290 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.262386 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.262457 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.262536 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:58.262518752 +0000 UTC m=+36.006779039 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.262573 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.262699 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:58.262676656 +0000 UTC m=+36.006937123 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.273516 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:50Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.309643 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.309984 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.310113 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.310238 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.310353 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.363747 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.363865 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.363900 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364011 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 09:06:58.363976888 +0000 UTC m=+36.108237175 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364062 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364083 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364076 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364137 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364157 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364098 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364220 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:58.364201174 +0000 UTC m=+36.108461481 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.364255 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:58.364238565 +0000 UTC m=+36.108498852 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.413991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.414037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.414049 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.414069 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.414082 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.517213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.517263 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.517274 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.517292 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.517307 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.556621 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.556651 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.556720 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.556786 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.557237 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:50 crc kubenswrapper[4672]: E1206 09:06:50.557321 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.620338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.620390 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.620401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.620421 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.620436 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.722547 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.722628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.722642 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.722657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.722666 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.800363 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.825143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.825180 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.825191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.825211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.825222 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.927899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.927943 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.927951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.927965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:50 crc kubenswrapper[4672]: I1206 09:06:50.927979 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:50Z","lastTransitionTime":"2025-12-06T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.059819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.059867 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.059882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.059903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.059919 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.162875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.162931 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.162945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.162966 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.162981 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.269135 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.269178 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.269196 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.269215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.269229 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.372174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.372224 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.372235 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.372253 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.372263 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.475029 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.475734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.475768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.475789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.475804 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.578821 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.578879 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.578905 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.578922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.578934 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.681853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.681904 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.681922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.681945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.681962 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.785406 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.785927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.786130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.786340 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.786526 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.809925 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/0.log" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.812746 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1" exitCode=1 Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.812791 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.813334 4672 scope.go:117] "RemoveContainer" containerID="8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.832059 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.851977 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.868533 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.881417 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.890341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.890432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.890446 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.890468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.890511 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.895102 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.914910 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:51Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.297631 5855 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:06:51.297983 5855 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.298644 5855 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 09:06:51.298707 5855 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 09:06:51.298717 5855 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 09:06:51.298797 5855 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 09:06:51.298824 5855 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 09:06:51.298824 5855 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 09:06:51.298880 5855 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 09:06:51.298875 5855 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 09:06:51.298927 5855 factory.go:656] Stopping watch factory\\\\nI1206 09:06:51.299003 5855 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:06:51.298936 5855 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 
09:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.929077 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.941060 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0
a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.952971 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-0
6T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.969515 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.983470 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:51Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.994269 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.994316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.994327 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.994346 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:51 crc kubenswrapper[4672]: I1206 09:06:51.994360 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:51Z","lastTransitionTime":"2025-12-06T09:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.004503 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.020351 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.032052 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.096956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.096997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.097008 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.097024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.097035 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.199534 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.199593 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.199647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.199671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.199688 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.306529 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.306679 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.306700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.306734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.306766 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.410718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.410775 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.410792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.410818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.410840 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.513804 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.513858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.513875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.513899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.513918 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.556452 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.556521 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.556687 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:52 crc kubenswrapper[4672]: E1206 09:06:52.556699 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:52 crc kubenswrapper[4672]: E1206 09:06:52.556819 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:52 crc kubenswrapper[4672]: E1206 09:06:52.556918 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.586057 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.605248 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.616404 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.616465 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.616481 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.616503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.616520 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.628107 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.658525 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.673064 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.687891 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.700893 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.719314 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.719388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.719402 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.719421 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.719445 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.720845 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f43
4004693faabb0b09140a10a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:51Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.297631 5855 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:06:51.297983 5855 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.298644 5855 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 09:06:51.298707 5855 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 09:06:51.298717 5855 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 09:06:51.298797 5855 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 09:06:51.298824 5855 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 09:06:51.298824 5855 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 09:06:51.298880 5855 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 09:06:51.298875 5855 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 09:06:51.298927 5855 factory.go:656] Stopping watch factory\\\\nI1206 09:06:51.299003 5855 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:06:51.298936 5855 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 
09:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.734322 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.746705 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.758904 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.769250 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.784292 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.796392 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.817509 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/0.log" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.819592 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.819822 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.823982 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.824017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.824027 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.824048 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.824058 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.837843 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.850705 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.871641 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.883730 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.899621 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 
2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.913703 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.926293 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.926329 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.926357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.926372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.926382 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:52Z","lastTransitionTime":"2025-12-06T09:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.934635 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4
ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:51Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.297631 5855 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:06:51.297983 5855 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.298644 5855 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 09:06:51.298707 5855 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 09:06:51.298717 5855 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 09:06:51.298797 5855 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 09:06:51.298824 5855 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 09:06:51.298824 5855 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 09:06:51.298880 5855 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 09:06:51.298875 5855 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 09:06:51.298927 5855 factory.go:656] Stopping watch factory\\\\nI1206 09:06:51.299003 5855 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:06:51.298936 5855 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 
09:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.951507 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.967697 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.979117 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:52 crc kubenswrapper[4672]: I1206 09:06:52.991968 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.006997 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.026372 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
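
[annotation] The two records above are the first full instances of the failure that dominates this stretch of the journal: every status patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is earlier than the node clock (2025-12-06T09:06:53Z). A minimal sketch of that validity check follows — it is illustrative, not the kubelet's or Go TLS stack's actual code path, and the certificate path is hypothetical, not taken from the log.

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path; substitute the webhook's actual serving cert.
	pemBytes, err := os.ReadFile("/tmp/webhook-serving.crt")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now().UTC()
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	case now.After(cert.NotAfter):
		// This is the branch the kubelet errors above correspond to.
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}

[end annotation]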
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.028400 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.028457 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.028468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.028482 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.028493 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.043474 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.130947 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.130979 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.130987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 
09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.130999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.131024 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.234086 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.234149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.234169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.234193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.234211 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.338535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.338637 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.338675 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.338707 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.338727 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
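
[annotation] The repeating setters.go records above all flip the node's Ready condition to False for the same reason: no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ (on this cluster, ovnkube-controller writes it once it is healthy, and it is crash-looping behind the expired certificate). A simplified sketch of that readiness test — not the kubelet's actual implementation — which just reports whether any .conf/.conflist/.json file is present in the directory.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether dir contains at least one file with a
// CNI-config extension, mirroring the "no CNI configuration file" message.
func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ready, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Println("NetworkReady=false:", err)
		return
	}
	fmt.Printf("NetworkReady=%v\n", ready)
}

[end annotation]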
Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.442240 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.442309 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.442334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.442364 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.442387 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.446396 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n"] Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.447792 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.449823 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.452744 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.471545 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.488838 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.495967 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61df9d53-92e8-439f-8d15-44e96d25a23e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.496003 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61df9d53-92e8-439f-8d15-44e96d25a23e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.496052 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61df9d53-92e8-439f-8d15-44e96d25a23e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.496076 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwl8\" (UniqueName: \"kubernetes.io/projected/61df9d53-92e8-439f-8d15-44e96d25a23e-kube-api-access-svwl8\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.508640 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4
ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:51Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.297631 5855 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:06:51.297983 5855 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.298644 5855 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 09:06:51.298707 5855 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 09:06:51.298717 5855 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 09:06:51.298797 5855 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 09:06:51.298824 5855 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 09:06:51.298824 5855 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 09:06:51.298880 5855 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 09:06:51.298875 5855 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 09:06:51.298927 5855 factory.go:656] Stopping watch factory\\\\nI1206 09:06:51.299003 5855 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:06:51.298936 5855 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 
09:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.525565 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.541705 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.544420 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.544474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.544490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.544506 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.544516 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.556361 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.568473 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
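
[annotation] Each failing record is the status manager trying to send a strategic-merge patch shaped like the quoted JSON: a $setElementOrder/conditions directive that pins the order of the condition list, plus only the changed entries. A minimal sketch of assembling such a body, using plain maps rather than the real k8s.io/api structs; the uid is copied from the iptables-alerter-4ln5h record above, and the patch is only printed, not sent.

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	order := []map[string]string{
		{"type": "PodReadyToStartContainers"},
		{"type": "Initialized"},
		{"type": "Ready"},
		{"type": "ContainersReady"},
		{"type": "PodScheduled"},
	}
	patch := map[string]any{
		"metadata": map[string]any{"uid": "d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"},
		"status": map[string]any{
			"$setElementOrder/conditions": order,
			"conditions": []map[string]any{
				{"type": "Ready", "status": "True", "lastTransitionTime": "2025-12-06T09:06:45Z"},
			},
		},
	}
	body, err := json.Marshal(patch)
	if err != nil {
		panic(err)
	}
	// In the kubelet this body goes to the pod's /status subresource as a
	// strategic-merge PATCH; it is that request the webhook is rejecting.
	fmt.Println(string(body))
}

[end annotation]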
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.586210 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.596429 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61df9d53-92e8-439f-8d15-44e96d25a23e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.596491 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svwl8\" (UniqueName: \"kubernetes.io/projected/61df9d53-92e8-439f-8d15-44e96d25a23e-kube-api-access-svwl8\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.596546 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61df9d53-92e8-439f-8d15-44e96d25a23e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.596567 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61df9d53-92e8-439f-8d15-44e96d25a23e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.597083 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61df9d53-92e8-439f-8d15-44e96d25a23e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.597365 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61df9d53-92e8-439f-8d15-44e96d25a23e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.601444 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.604018 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61df9d53-92e8-439f-8d15-44e96d25a23e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.613814 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwl8\" (UniqueName: \"kubernetes.io/projected/61df9d53-92e8-439f-8d15-44e96d25a23e-kube-api-access-svwl8\") pod \"ovnkube-control-plane-749d76644c-ch46n\" (UID: \"61df9d53-92e8-439f-8d15-44e96d25a23e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.617185 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.627934 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.641347 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.646521 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.646545 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.646553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.646566 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.646574 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.653092 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.663890 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.672362 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.750100 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.750172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.750201 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.750233 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.750255 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.762310 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" Dec 06 09:06:53 crc kubenswrapper[4672]: W1206 09:06:53.783206 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61df9d53_92e8_439f_8d15_44e96d25a23e.slice/crio-2af1c358c1eae4053c2ae1ddc2b65ebfe7b1daf46ae3b92af7e5a0a852ca17ce WatchSource:0}: Error finding container 2af1c358c1eae4053c2ae1ddc2b65ebfe7b1daf46ae3b92af7e5a0a852ca17ce: Status 404 returned error can't find the container with id 2af1c358c1eae4053c2ae1ddc2b65ebfe7b1daf46ae3b92af7e5a0a852ca17ce Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.827815 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/1.log" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.828940 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/0.log" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.832440 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de" exitCode=1 Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.832508 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.832566 4672 scope.go:117] "RemoveContainer" containerID="8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.833519 4672 scope.go:117] "RemoveContainer" containerID="19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de" Dec 06 09:06:53 crc kubenswrapper[4672]: E1206 09:06:53.833707 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.834012 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" event={"ID":"61df9d53-92e8-439f-8d15-44e96d25a23e","Type":"ContainerStarted","Data":"2af1c358c1eae4053c2ae1ddc2b65ebfe7b1daf46ae3b92af7e5a0a852ca17ce"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.852781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.852833 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.852854 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.852882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 
09:06:53.852903 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.852972 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce
0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.865478 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.879241 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.894084 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.908243 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.918968 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.931732 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 
2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.943742 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.954905 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.954942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.954951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.954965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.954974 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:53Z","lastTransitionTime":"2025-12-06T09:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.965757 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4
ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd51cd7e143bac4cb88a4224e7da67827bf8f434004693faabb0b09140a10a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:51Z\\\",\\\"message\\\":\\\"go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.297631 5855 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:06:51.297983 5855 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:06:51.298644 5855 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1206 09:06:51.298707 5855 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 09:06:51.298717 5855 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 09:06:51.298797 5855 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1206 09:06:51.298824 5855 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 09:06:51.298824 5855 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 09:06:51.298880 5855 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1206 09:06:51.298875 5855 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 09:06:51.298927 5855 factory.go:656] Stopping watch factory\\\\nI1206 09:06:51.299003 5855 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:06:51.298936 5855 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 09:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:53 crc kubenswrapper[4672]: I1206 09:06:53.986201 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.002041 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:53Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.017779 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.029294 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.045347 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc 
Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056783 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056799 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.056808 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
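The NotReady condition recorded here carries its own diagnosis: the container runtime reports NetworkReady=false because no CNI configuration file has been written to /etc/kubernetes/cni/net.d/ yet, and the multus pods that would write one are the same pods whose status updates the expired webhook keeps rejecting. The readiness test amounts to scanning that directory for a CNI config; the stand-in below is an assumption-laden sketch (the real probe lives in the container runtime via ocicni, which to my knowledge loads .conf, .conflist and .json files).

```go
// cniconf.go: illustrative stand-in for the check behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	var confs []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions ocicni accepts
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		// Same situation the kubelet is reporting for node "crc".
		fmt.Printf("no CNI configuration file in %s. Has your network provider started?\n", confDir)
		os.Exit(1)
	}
	fmt.Printf("found CNI configuration: %v\n", confs)
}
```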
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.159039 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.159105 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.159122 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.159145 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.159161 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.265645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.265699 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.265712 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.265732 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.265744 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.346681 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.346732 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.346748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.346771 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.346790 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.368009 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z"
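This entry shows the full shape of the loop: the kubelet's node-status PATCH (conditions, allocatable/capacity, cached image list) is assembled correctly, but the admission webhook in front of the Node object rejects every attempt with the same expired-certificate handshake failure, so "Error updating node status, will retry" repeats below with byte-identical payloads. In the upstream kubelet the per-sync retry budget is the constant nodeStatusUpdateRetry (5). The sketch below is a schematic of that bounded loop, not the kubelet's actual code; patchNodeStatus is a stand-in that fails the way this webhook does.

```go
// retrysketch.go: schematic of the bounded node-status retry visible in this log.
package main

import (
	"errors"
	"fmt"
)

// Upstream kubelet constant; each status sync gives up after this many tries.
const nodeStatusUpdateRetry = 5

func patchNodeStatus() error {
	// Stand-in for the real PATCH. The webhook's certificate is expired, so
	// the failure is deterministic: retrying cannot succeed until the
	// certificate is rotated (or the webhook is bypassed).
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": ` +
		`tls: failed to verify certificate: x509: certificate has expired or is not yet valid`)
}

func main() {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := patchNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return
	}
	fmt.Println("Unable to update node status: exceeded retry count")
}
```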
event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.373810 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.373824 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.373834 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.393491 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.399475 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.399556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.399583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.399649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.399676 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.420572 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.426124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.426185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.426198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.426214 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.426226 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.446163 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.450816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.450859 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.450870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.450887 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.450900 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.465834 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.465960 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.467639 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.467670 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.467701 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.467717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.467729 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.556242 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.556246 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.556314 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.556464 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.556711 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.556817 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.569646 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.569687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.569720 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.569736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.569747 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.677133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.677508 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.677526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.677547 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.677563 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.780380 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.780439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.780456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.780480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.780502 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.840133 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/1.log" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.846822 4672 scope.go:117] "RemoveContainer" containerID="19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de" Dec 06 09:06:54 crc kubenswrapper[4672]: E1206 09:06:54.847190 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.847581 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" event={"ID":"61df9d53-92e8-439f-8d15-44e96d25a23e","Type":"ContainerStarted","Data":"9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.847708 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" event={"ID":"61df9d53-92e8-439f-8d15-44e96d25a23e","Type":"ContainerStarted","Data":"a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.865870 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.883422 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.883677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.883768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.883851 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.883940 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.885579 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.906167 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.926430 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.940854 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.954565 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.966517 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.986380 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.986426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.986439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.986456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.986468 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:54Z","lastTransitionTime":"2025-12-06T09:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:54 crc kubenswrapper[4672]: I1206 09:06:54.988019 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:54Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.004724 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.018684 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: 
I1206 09:06:55.036334 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.051108 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.064576 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.081048 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.088925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.088997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.089012 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.089029 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.089040 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.096645 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.110902 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.134777 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4
ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.155922 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.170335 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: 
I1206 09:06:55.184860 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.191305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.191346 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.191357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.191374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.191384 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.204676 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.220259 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.235886 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.250083 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.264524 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.264937 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-w587t"] Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.265427 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: E1206 09:06:55.265493 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.278223 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.289924 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.293515 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.293553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.293586 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.293623 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.293635 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.302221 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.315109 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.315238 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qdq7\" (UniqueName: \"kubernetes.io/projected/fca5f829-3091-4191-abf5-2bece3ab91f7-kube-api-access-9qdq7\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.315307 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.325702 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.339775 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.352744 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d
396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.364946 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.379264 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.391501 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.395567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.395715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.395748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.395777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.395797 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.404657 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.415934 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.416035 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qdq7\" (UniqueName: \"kubernetes.io/projected/fca5f829-3091-4191-abf5-2bece3ab91f7-kube-api-access-9qdq7\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: E1206 09:06:55.416198 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:55 crc kubenswrapper[4672]: E1206 09:06:55.416313 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:55.916282934 +0000 UTC m=+33.660543261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.418492 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.437165 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qdq7\" (UniqueName: \"kubernetes.io/projected/fca5f829-3091-4191-abf5-2bece3ab91f7-kube-api-access-9qdq7\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.449738 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\
\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.465250 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.478829 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.491431 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.497638 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.497667 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.497679 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.497694 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.497706 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.506332 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 
2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.521294 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.534996 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b
8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.550888 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.562214 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:55Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.601004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.601055 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.601072 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.601097 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.601114 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.703179 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.703217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.703225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.703240 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.703249 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.807929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.808292 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.808425 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.808567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.808764 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.911983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.912045 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.912065 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.912154 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.912188 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:55Z","lastTransitionTime":"2025-12-06T09:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:55 crc kubenswrapper[4672]: I1206 09:06:55.920831 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:55 crc kubenswrapper[4672]: E1206 09:06:55.921006 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:55 crc kubenswrapper[4672]: E1206 09:06:55.921251 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:56.921187581 +0000 UTC m=+34.665447868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.014253 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.014287 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.014301 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.014321 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.014334 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.117156 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.117218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.117234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.117259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.117276 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.220373 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.220429 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.220445 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.220468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.220487 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.323658 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.324017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.324126 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.324220 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.324318 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.427322 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.427375 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.427387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.427411 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.427428 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.530223 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.530282 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.530294 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.530310 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.530321 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.556580 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.556639 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.556581 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:56 crc kubenswrapper[4672]: E1206 09:06:56.556756 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:06:56 crc kubenswrapper[4672]: E1206 09:06:56.556840 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:56 crc kubenswrapper[4672]: E1206 09:06:56.556897 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.557230 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:56 crc kubenswrapper[4672]: E1206 09:06:56.557413 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.633381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.633463 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.633486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.633515 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.633537 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.736938 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.737000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.737016 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.737039 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.737056 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.840326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.840358 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.840367 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.840379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.840388 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.932382 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:56 crc kubenswrapper[4672]: E1206 09:06:56.932528 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:56 crc kubenswrapper[4672]: E1206 09:06:56.932628 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:06:58.932585648 +0000 UTC m=+36.676845955 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.942433 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.942462 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.942471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.942485 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:56 crc kubenswrapper[4672]: I1206 09:06:56.942498 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:56Z","lastTransitionTime":"2025-12-06T09:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.045462 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.045513 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.045525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.045543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.045555 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.136072 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.148642 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.148705 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.148721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.148771 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.148786 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.152107 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.175411 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/o
pt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"rou
teoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.197361 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.215729 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.233393 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.252142 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.252213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.252229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.252252 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.252268 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.256802 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.276595 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.294200 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d
396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.310833 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.331643 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.347865 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.354981 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.355339 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.355692 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.355996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.356189 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.365241 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.379262 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.403597 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.418851 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.435720 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1
ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:57Z is after 2025-08-24T17:21:41Z" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.459741 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.459794 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.459807 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.459829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.459844 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.563587 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.564199 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.564403 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.564642 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.564832 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.668430 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.668934 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.669042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.669148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.669212 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.772169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.772211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.772221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.772236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.772247 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.874086 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.874451 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.874761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.874885 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.874970 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.978280 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.978366 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.978395 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.978430 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:57 crc kubenswrapper[4672]: I1206 09:06:57.978454 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:57Z","lastTransitionTime":"2025-12-06T09:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.082195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.082262 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.082281 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.082307 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.082325 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.086917 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.087928 4672 scope.go:117] "RemoveContainer" containerID="19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.088125 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.185297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.185338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.185346 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.185362 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.185371 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.287891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.288151 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.288175 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.288205 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.288227 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.348204 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.348344 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.348432 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.348466 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.348553 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:14.348525694 +0000 UTC m=+52.092786011 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.348587 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:14.348571945 +0000 UTC m=+52.092832272 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.390763 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.390809 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.390820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.390838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.390850 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.449645 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.449754 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.449787 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.449874 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:07:14.449842746 +0000 UTC m=+52.194103043 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.449895 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.449910 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.449923 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.449975 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:14.449961329 +0000 UTC m=+52.194221616 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.450033 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.450069 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.450088 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.450171 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:14.450153574 +0000 UTC m=+52.194413901 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.493677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.494035 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.494177 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.494320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.494498 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.556483 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.556558 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.556724 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.556776 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.556845 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.557016 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.557134 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.557209 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.597434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.597464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.597473 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.597488 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.597498 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.700124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.700168 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.700181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.700199 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.700211 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.803343 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.803410 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.803420 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.803434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.803444 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.905783 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.905824 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.905837 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.905855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.905866 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:58Z","lastTransitionTime":"2025-12-06T09:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:58 crc kubenswrapper[4672]: I1206 09:06:58.955086 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.955269 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:58 crc kubenswrapper[4672]: E1206 09:06:58.955332 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:02.955319078 +0000 UTC m=+40.699579365 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.008946 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.009024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.009046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.009076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.009104 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.113981 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.114052 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.114074 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.114101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.114121 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.217272 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.217328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.217346 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.217367 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.217385 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.320365 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.320400 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.320410 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.320423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.320433 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.423331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.423401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.423417 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.423441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.423460 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.526373 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.526413 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.526430 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.526454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.526471 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.630816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.630854 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.630863 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.630877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.630887 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.733326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.733379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.733394 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.733412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.733425 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.837120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.837186 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.837208 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.837237 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.837256 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.940303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.940359 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.940380 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.940408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:06:59 crc kubenswrapper[4672]: I1206 09:06:59.940426 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:06:59Z","lastTransitionTime":"2025-12-06T09:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.043429 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.043463 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.043472 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.043484 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.043492 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.145975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.146008 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.146017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.146030 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.146038 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.248672 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.248723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.248737 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.248758 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.248768 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.351959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.352011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.352023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.352040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.352052 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.454803 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.454840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.454849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.454863 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.454875 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.555788 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:00 crc kubenswrapper[4672]: E1206 09:07:00.555930 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.556157 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.556427 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.556572 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:00 crc kubenswrapper[4672]: E1206 09:07:00.556535 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.556984 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.557012 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.557023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.557038 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.557049 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:00 crc kubenswrapper[4672]: E1206 09:07:00.557272 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:00 crc kubenswrapper[4672]: E1206 09:07:00.557273 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.659699 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.659762 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.659781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.659805 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.659824 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.762128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.762162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.762174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.762189 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.762204 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.864860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.865124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.865770 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.865809 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.865827 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.967491 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.967543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.967561 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.967581 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:00 crc kubenswrapper[4672]: I1206 09:07:00.967596 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:00Z","lastTransitionTime":"2025-12-06T09:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.070884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.071781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.071821 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.071845 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.071886 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.175582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.175684 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.175714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.175746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.175768 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.279632 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.279689 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.279701 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.279722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.279738 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.387585 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.387697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.387706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.387722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.387730 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.490567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.490845 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.491000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.491288 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.491540 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.594826 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.594875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.594891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.594913 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.594930 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.697751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.697795 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.697803 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.697820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.697829 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.799966 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.800001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.800010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.800023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.800033 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.902588 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.902917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.902986 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.903059 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:01 crc kubenswrapper[4672]: I1206 09:07:01.903115 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:01Z","lastTransitionTime":"2025-12-06T09:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.006555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.006806 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.006933 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.007004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.007063 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.109250 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.109840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.109918 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.110037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.110101 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.212688 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.212717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.212726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.212741 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.212750 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.315412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.315477 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.315500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.315527 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.315547 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.418035 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.418088 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.418105 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.418132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.418150 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.520964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.521028 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.521040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.521056 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.521067 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.556958 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.556958 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:02 crc kubenswrapper[4672]: E1206 09:07:02.557142 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.556989 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.557219 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:02 crc kubenswrapper[4672]: E1206 09:07:02.557323 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:02 crc kubenswrapper[4672]: E1206 09:07:02.557413 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:07:02 crc kubenswrapper[4672]: E1206 09:07:02.557496 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.575102 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.589513 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.610128 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.622796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.622847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.622860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.622880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.622894 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.627466 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.641298 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.655355 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.676011 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.700871 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.713488 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: 
I1206 09:07:02.724749 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.724783 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.724791 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.724806 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.724817 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.725723 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.741865 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.758028 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.774769 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.791856 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.807656 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.820024 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d
396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:02Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.826920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.826963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.826975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.826991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.827002 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.929591 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.929718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.929734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.929759 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:02 crc kubenswrapper[4672]: I1206 09:07:02.929779 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:02Z","lastTransitionTime":"2025-12-06T09:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.002886 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:03 crc kubenswrapper[4672]: E1206 09:07:03.003089 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:07:03 crc kubenswrapper[4672]: E1206 09:07:03.003189 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:11.003168022 +0000 UTC m=+48.747428309 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.031938 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.031994 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.032007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.032023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.032034 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.134973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.135063 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.135084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.135118 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.135138 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.237989 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.238070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.238097 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.238127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.238149 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.341068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.341110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.341120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.341136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.341147 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.444099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.444152 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.444169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.444189 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.444201 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.546505 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.546544 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.546556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.546577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.546590 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.649247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.649318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.649343 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.649368 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.649386 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.752213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.752318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.752336 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.752357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.752371 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.856740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.856785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.856798 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.856818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.856832 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.959795 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.959872 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.959896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.959927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:03 crc kubenswrapper[4672]: I1206 09:07:03.959945 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:03Z","lastTransitionTime":"2025-12-06T09:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.062706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.062767 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.062789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.062819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.062842 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.166157 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.166208 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.166226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.166248 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.166264 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.269718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.269795 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.269806 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.269836 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.269859 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.373247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.373284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.373295 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.373310 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.373321 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.476866 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.476962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.476973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.476995 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.477009 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.556662 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.556693 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.556748 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.556808 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.557041 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.557233 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.557379 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.557529 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.580959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.581067 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.581091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.581125 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.581143 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.685050 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.685108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.685120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.685142 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.685155 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.697746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.697821 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.697845 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.697877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.697905 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.720951 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:04Z is after 2025-08-24T17:21:41Z"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.726284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.726350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
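[Annotation] The E1206 09:07:04.720951 record above shows the actual blocker for node status updates: the patch itself is well-formed, but the node.network-node-identity.openshift.io webhook behind https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z. A stdlib-only Python sketch that reproduces the kubelet's verification failure follows; host and port are taken from the log, and it assumes it is run on the node itself (run elsewhere, or after the certificate has been rotated, the result will differ).

```python
import socket
import ssl

# Endpoint from the failed webhook call in the record above.
HOST, PORT = "127.0.0.1", 9743

def check_webhook_cert(host: str = HOST, port: int = PORT) -> None:
    """Do a verifying TLS handshake, the same step that failed for the
    kubelet, and report the verification error if there is one."""
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                print("handshake OK: certificate verified")
    except ssl.SSLCertVerificationError as err:
        # For the certificate in this log: 'certificate has expired'
        print("verification failed:", err.verify_message)

if __name__ == "__main__":
    check_webhook_cert()
```

Note that create_default_context trusts only the system CA bundle; since this webhook's certificate is signed by a cluster-internal CA, a non-expired certificate would still fail with an issuer error here unless the cluster CA were loaded via ctx.load_verify_locations. Either way, the expired-certificate case in this log is what the kubelet keeps retrying against in the records below.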
event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.726374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.726404 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.726425 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.750152 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:04Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.755778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.755828 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.755846 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.755870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.755919 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.777046 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:04Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.782438 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.782512 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.782531 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.782559 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.782580 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.801033 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:04Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.806365 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.806421 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.806433 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.806454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.806467 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.827883 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:04Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:04 crc kubenswrapper[4672]: E1206 09:07:04.828106 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.830379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.830433 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.830450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.830474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.830491 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.933064 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.933140 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.933159 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.933182 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:04 crc kubenswrapper[4672]: I1206 09:07:04.933200 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:04Z","lastTransitionTime":"2025-12-06T09:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.036650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.036732 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.036751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.036776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.036793 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.140061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.140133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.140165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.140197 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.140222 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.243578 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.243674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.243692 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.243718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.243737 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.346192 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.346238 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.346249 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.346265 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.346276 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.449372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.449405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.449414 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.449427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.449436 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.551915 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.551961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.551977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.551999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.552018 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.654208 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.654265 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.654281 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.654303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.654319 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.759630 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.760221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.760243 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.760258 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.760267 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.862835 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.862868 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.862877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.862889 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.862898 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.965519 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.965581 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.965626 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.965651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:05 crc kubenswrapper[4672]: I1206 09:07:05.965668 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:05Z","lastTransitionTime":"2025-12-06T09:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.068980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.069057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.069107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.069139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.069160 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.172269 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.172307 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.172315 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.172330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.172340 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.275130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.275201 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.275226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.275256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.275277 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.377569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.377675 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.377691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.377709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.377739 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.480464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.480532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.480543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.480558 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.480567 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.557827 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:06 crc kubenswrapper[4672]: E1206 09:07:06.557973 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.558349 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:06 crc kubenswrapper[4672]: E1206 09:07:06.558524 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.558406 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:06 crc kubenswrapper[4672]: E1206 09:07:06.558720 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.558378 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:06 crc kubenswrapper[4672]: E1206 09:07:06.558876 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.583740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.583996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.584072 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.584173 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.584244 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.687284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.687340 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.687363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.687392 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.687414 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.789843 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.789896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.789912 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.789935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.789953 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.892468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.892647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.892676 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.892707 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.892728 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.995665 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.995699 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.995707 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.995722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:06 crc kubenswrapper[4672]: I1206 09:07:06.995730 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:06Z","lastTransitionTime":"2025-12-06T09:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.098715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.099056 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.099336 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.099742 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.100110 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.202447 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.202483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.202491 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.202503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.202512 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.306023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.306078 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.306089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.306103 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.306114 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.410867 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.410910 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.410922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.410941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.410957 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.514407 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.514498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.514529 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.514563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.514582 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.617093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.617153 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.617168 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.617182 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.617191 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.719385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.719437 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.719450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.719467 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.719479 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.821624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.821661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.821670 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.821682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.821689 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.924928 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.925252 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.925274 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.925302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:07 crc kubenswrapper[4672]: I1206 09:07:07.925324 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:07Z","lastTransitionTime":"2025-12-06T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.027326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.027370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.027378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.027394 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.027403 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.131146 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.131190 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.131199 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.131216 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.131225 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.234192 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.234423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.234522 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.234647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.234724 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.338043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.338671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.338700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.338721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.338736 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.441856 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.441939 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.441964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.441998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.442024 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.545119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.545173 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.545188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.545211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.545225 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.556542 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.556648 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.556674 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:08 crc kubenswrapper[4672]: E1206 09:07:08.556958 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.557065 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:08 crc kubenswrapper[4672]: E1206 09:07:08.557181 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:08 crc kubenswrapper[4672]: E1206 09:07:08.557367 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:08 crc kubenswrapper[4672]: E1206 09:07:08.557487 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.647277 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.647308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.647316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.647327 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.647338 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.749464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.749518 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.749533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.749548 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.749556 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.852255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.852316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.852338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.852363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.852385 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.955537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.955584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.955619 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.955645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:08 crc kubenswrapper[4672]: I1206 09:07:08.955666 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:08Z","lastTransitionTime":"2025-12-06T09:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.058451 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.058501 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.058512 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.058530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.058540 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.161421 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.161476 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.161487 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.161508 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.161521 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.265335 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.265377 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.265415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.265435 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.265449 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.368371 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.368447 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.368475 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.368507 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.368531 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.471259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.471331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.471353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.471381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.471405 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.574062 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.574109 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.574121 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.574141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.574155 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.677221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.677531 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.677775 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.678019 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.678235 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.781781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.782042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.782244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.782456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.782708 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.886163 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.886237 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.886256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.886281 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.886300 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.989576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.989745 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.989771 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.989829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:09 crc kubenswrapper[4672]: I1206 09:07:09.989874 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:09Z","lastTransitionTime":"2025-12-06T09:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.092458 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.092506 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.092522 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.092551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.092568 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.195650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.195716 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.195742 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.195774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.195809 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.298475 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.298551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.298576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.298638 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.298673 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.402039 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.402132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.402158 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.402517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.402537 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.506188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.506262 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.506284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.506313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.506335 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.558346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:10 crc kubenswrapper[4672]: E1206 09:07:10.558457 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.558581 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.558684 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:10 crc kubenswrapper[4672]: E1206 09:07:10.558759 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.558850 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:10 crc kubenswrapper[4672]: E1206 09:07:10.558858 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:10 crc kubenswrapper[4672]: E1206 09:07:10.558933 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.609725 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.609769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.609778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.609792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.609804 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.712500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.712545 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.712555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.712570 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.712581 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.814983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.815021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.815034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.815053 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.815066 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.917930 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.917975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.917987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.918003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:10 crc kubenswrapper[4672]: I1206 09:07:10.918015 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:10Z","lastTransitionTime":"2025-12-06T09:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.021855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.021917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.021941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.021973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.021997 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.087753 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:11 crc kubenswrapper[4672]: E1206 09:07:11.088005 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:07:11 crc kubenswrapper[4672]: E1206 09:07:11.088124 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:27.088087026 +0000 UTC m=+64.832347353 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.125354 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.125408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.125426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.125451 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.125470 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.228223 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.228278 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.228331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.228357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.228376 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.331762 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.331841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.331931 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.331968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.332025 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.435303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.435381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.435405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.435439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.435460 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.538637 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.538683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.538695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.538714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.538731 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.558101 4672 scope.go:117] "RemoveContainer" containerID="19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.640683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.640726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.640739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.640755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.640765 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.743362 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.743400 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.743413 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.743431 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.743446 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.845738 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.845790 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.845800 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.845819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.845830 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.904177 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/1.log" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.907787 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.908236 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.926867 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:11Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.940707 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:11Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.947883 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.947926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.947942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.947963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.947979 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:11Z","lastTransitionTime":"2025-12-06T09:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.968580 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f73
4bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:11Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:11 crc kubenswrapper[4672]: I1206 09:07:11.985629 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:11Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.001702 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:11Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.015371 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.028631 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.040207 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.050293 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.050339 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.050355 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.050374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.050387 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.051736 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.064037 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-cont
roller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.075389 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 
09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.089715 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.102702 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.113555 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.127363 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.137497 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc 
kubenswrapper[4672]: I1206 09:07:12.152217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.152245 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.152253 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.152266 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.152275 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.254308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.254349 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.254363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.254382 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.254392 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.356889 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.356948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.356957 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.356972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.356983 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.460453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.460522 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.460540 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.460565 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.460583 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.556698 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.556765 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.556769 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:12 crc kubenswrapper[4672]: E1206 09:07:12.556829 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.556857 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:12 crc kubenswrapper[4672]: E1206 09:07:12.557047 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:12 crc kubenswrapper[4672]: E1206 09:07:12.557070 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:12 crc kubenswrapper[4672]: E1206 09:07:12.557101 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.566048 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.566112 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.566135 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.566163 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.566186 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.579009 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.599306 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.618424 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.630982 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.646915 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc 
kubenswrapper[4672]: I1206 09:07:12.659517 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.668305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.668436 4672 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.668450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.668473 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.668485 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.672022 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.690066 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f73
4bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.691907 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.701849 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.704824 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.723550 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.735573 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.747649 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.763157 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.770762 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.770800 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc 
kubenswrapper[4672]: I1206 09:07:12.770809 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.770825 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.770835 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.777881 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.791630 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d
4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.805239 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 
09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.822272 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.837798 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.851207 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.873838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.873886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.873898 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.873917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.873929 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.876019 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.895441 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.914120 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/2.log" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.914847 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.915384 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/1.log" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.919293 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c" exitCode=1 Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.919434 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.919506 4672 scope.go:117] "RemoveContainer" containerID="19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.920501 4672 scope.go:117] "RemoveContainer" containerID="24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c" Dec 06 09:07:12 crc kubenswrapper[4672]: E1206 09:07:12.921142 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.936416 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
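The "back-off 20s" for ovnkube-controller above is the kubelet's standard CrashLoopBackOff schedule at work: the restart delay starts at 10s, doubles with each consecutive failure, and is capped at 5m. Those are the upstream kubelet defaults, assumed stock on this node, not values read from its config. A sketch of the schedule:

// backoff.go: the documented CrashLoopBackOff delay progression
// (10s base, doubling per consecutive crash, capped at 5 minutes).
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second
	const max = 5 * time.Minute
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > max {
			delay = max
		}
	}
}

A 20s delay therefore corresponds to the second consecutive crash, consistent with the restartCount of 2 reported for this container further down.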
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.951667 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.969931 4672 status_manager.go:875] "Failed to 
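Each of these errors embeds the strategic-merge-patch payload as a quoted string inside err=, which is why the JSON arrives multiply escaped in the journal; the $setElementOrder/conditions directive inside it records the desired ordering of the conditions list, whose entries the API server merges by their "type" key. To read one of these payloads offline, strip the journal's own quoting layer, then one strconv.Unquote plus json.Indent is enough. A stdlib-only sketch; the payload literal below is a shortened, singly escaped stand-in, not a full entry from this log:

// patchdump.go: unescape a copied status-patch payload and pretty-print it.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	escaped := `"{\"metadata\":{\"uid\":\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\"},\"status\":{\"phase\":\"Running\"}}"`

	raw, err := strconv.Unquote(escaped)
	if err != nil {
		log.Fatalf("unquote: %v", err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(raw), "", "  "); err != nil {
		log.Fatalf("indent: %v", err)
	}
	fmt.Println(pretty.String())
}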
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.976359 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.976400 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.976414 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.976503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.976517 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:12Z","lastTransitionTime":"2025-12-06T09:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:12 crc kubenswrapper[4672]: I1206 09:07:12.984426 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
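The NodeNotReady transition recorded just above is a downstream symptom: the runtime finds no CNI configuration under /etc/kubernetes/cni/net.d/, and on this node that file is presumably written by the OVN-Kubernetes plugin once ovnkube-controller stays up, which it cannot do while the same expired webhook certificate blocks its node annotations. A small directory check mirroring the kubelet's complaint; the path comes from the message itself, the extension list covers the common CNI config forms and is not exhaustive:

// cnicheck.go: report whether the CNI conf directory named in the
// NodeNotReady message contains any network configuration yet.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory from the kubelet message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files: the network plugin has not started")
	}
}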
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
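The check-endpoints termination message above also logs several "Use of insecure cipher ... detected" warnings for CBC and static-RSA suites. As a side note, a Go server avoids those by pinning TLS 1.2 AEAD suites; a sketch of such a config (the suite list is illustrative, and TLS 1.3 suites are not configurable in crypto/tls):

// ciphers.go: a server-side tls.Config without the suites flagged above.
package main

import (
	"crypto/tls"
	"fmt"
)

func main() {
	cfg := &tls.Config{
		MinVersion: tls.VersionTLS12,
		CipherSuites: []uint16{ // ECDHE + AEAD only; no CBC, no static RSA
			tls.TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,
			tls.TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,
			tls.TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
			tls.TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,
			tls.TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,
			tls.TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256,
		},
	}
	for _, id := range cfg.CipherSuites {
		fmt.Println(tls.CipherSuiteName(id))
	}
}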
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.000016 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:12Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.014972 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
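One detail worth decoding from the network-check-source status above: check-endpoints reports lastState.terminated with exitCode 137 and reason ContainerStatusUnknown, meaning the container could not be located when the pod was deleted. The 137 follows the usual shell convention of 128 plus the signal number, i.e. SIGKILL. A sketch of that arithmetic:

// exitcode.go: interpret container exit codes like the 137 above
// (conventionally 128+signal when a process dies to a signal; 143 = SIGTERM).
package main

import (
	"fmt"
	"syscall"
)

func main() {
	for _, code := range []int{0, 1, 137, 143} {
		switch {
		case code == 0:
			fmt.Printf("%3d: success\n", code)
		case code > 128:
			fmt.Printf("%3d: killed by signal %d (%s)\n", code, code-128, syscall.Signal(code-128))
		default:
			fmt.Printf("%3d: application error\n", code)
		}
	}
}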
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.023863 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.037025 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.048344 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
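The multus-additional-cni-plugins status above carries a full init-container chain (egress-router-binary-copy, cni-plugins, bond-cni-plugin, routeoverride-cni, whereabouts-cni-bincopy, whereabouts-cni), each terminated Completed with exit 0 in strict sequence, which is exactly how init containers execute. To audit such a payload offline, a minimal struct decode is enough. Stdlib only; the struct and payload below mirror just the fields used in these entries, heavily abbreviated:

// initchain.go: decode initContainerStatuses from a pod status payload
// (as embedded in the entries here) and print the completion chain.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

type containerStatus struct {
	Name  string `json:"name"`
	State struct {
		Terminated *struct {
			ExitCode   int    `json:"exitCode"`
			Reason     string `json:"reason"`
			FinishedAt string `json:"finishedAt"`
		} `json:"terminated"`
	} `json:"state"`
}

type podStatusPatch struct {
	Status struct {
		InitContainerStatuses []containerStatus `json:"initContainerStatuses"`
	} `json:"status"`
}

func main() {
	// Abbreviated stand-in for the unescaped multus-additional-cni-plugins payload.
	payload := `{"status":{"initContainerStatuses":[
	  {"name":"egress-router-binary-copy","state":{"terminated":{"exitCode":0,"reason":"Completed","finishedAt":"2025-12-06T09:06:43Z"}}},
	  {"name":"cni-plugins","state":{"terminated":{"exitCode":0,"reason":"Completed","finishedAt":"2025-12-06T09:06:44Z"}}}]}}`

	var p podStatusPatch
	if err := json.Unmarshal([]byte(payload), &p); err != nil {
		log.Fatalf("unmarshal: %v", err)
	}
	for i, c := range p.Status.InitContainerStatuses {
		if t := c.State.Terminated; t != nil {
			fmt.Printf("%d. %s: %s (exit %d) at %s\n", i+1, c.Name, t.Reason, t.ExitCode, t.FinishedAt)
		}
	}
}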
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.063078 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.078739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.078777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.078788 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.078801 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.078810 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.079307 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.097150 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.109956 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 
09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.124836 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.135372 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.146815 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.160277 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.172807 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.181417 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.181454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.181463 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.181478 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.181488 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.185241 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.197298 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.220809 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19106a472fff58d8192b9ca06cf0f166038a1ef4ddd83e83bc3384bd41b3e8de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:06:52Z is after 2025-08-24T17:21:41Z]\\\\nI1206 09:06:53.047780 6007 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: oauth-openshi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} 
name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3
da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.233790 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.248383 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.260419 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.271402 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.284689 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.284740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.284751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.284765 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.284776 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.288008 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac4
17e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.301321 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.315002 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.387787 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.387852 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.387870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.387895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.387914 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.490911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.490959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.490970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.490987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.490998 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.594178 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.594225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.594236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.594250 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.594260 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.697823 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.697878 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.697890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.697908 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.697921 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.800653 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.800712 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.800723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.800746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.800757 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.903757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.903806 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.903818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.903835 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.903849 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:13Z","lastTransitionTime":"2025-12-06T09:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.924865 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/2.log" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.929786 4672 scope.go:117] "RemoveContainer" containerID="24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c" Dec 06 09:07:13 crc kubenswrapper[4672]: E1206 09:07:13.929981 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.944514 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.958020 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.973363 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:13 crc kubenswrapper[4672]: I1206 09:07:13.987498 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.001540 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:13Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.007173 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.007223 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.007237 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.007257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.007273 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.018906 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.036077 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.052981 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 
09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.066747 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.079909 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.094807 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.107107 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.109310 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.109370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.109388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.109411 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.109427 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.121203 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.135219 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.164888 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.178948 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.193911 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:14Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.211640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.211700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.211718 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.211763 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.211783 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.314669 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.314748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.314793 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.314816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.314830 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.417826 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.417879 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.417897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.417922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.417941 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.423358 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.423505 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.423671 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.423760 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:46.423738205 +0000 UTC m=+84.167998532 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.423856 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.423971 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:46.423957711 +0000 UTC m=+84.168218008 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.521646 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.521700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.521712 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.521751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.521764 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.524255 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524448 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:07:46.524422911 +0000 UTC m=+84.268683208 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.524541 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.524714 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524722 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524804 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524819 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524831 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524843 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524853 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524916 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:46.524902233 +0000 UTC m=+84.269162540 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.524958 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:46.524935114 +0000 UTC m=+84.269195411 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.556809 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.556845 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.556824 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.556856 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.556984 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.557132 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.557237 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:14 crc kubenswrapper[4672]: E1206 09:07:14.557269 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.624739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.624803 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.624827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.624854 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.624875 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.727677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.727731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.727747 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.727773 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.727790 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.831034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.831077 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.831090 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.831104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.831114 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.933034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.933070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.933081 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.933096 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:14 crc kubenswrapper[4672]: I1206 09:07:14.933107 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:14Z","lastTransitionTime":"2025-12-06T09:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.037227 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.037302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.037328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.037355 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.037376 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.060320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.060407 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.060429 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.060452 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.060468 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: E1206 09:07:15.079856 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:15Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.083920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.083952 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.083964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.083980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.083991 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: E1206 09:07:15.101110 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:15Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.106540 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.106627 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.106647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.106672 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.106691 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: E1206 09:07:15.125318 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:15Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.131120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.131408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.131636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.131909 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.132110 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: E1206 09:07:15.153525 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.196780 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.196829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.196841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.196861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.196874 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:15 crc kubenswrapper[4672]: E1206 09:07:15.213406 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.215297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.215331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.215343 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.215358 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.215368 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.318194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.318247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.318258 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.318277 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.318287 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.421631 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.421679 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.421689 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.421704 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.421717 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.524438 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.524491 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.524507 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.524534 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.524556 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.628056 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.628120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.628143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.628172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.628192 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.731257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.731330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.731353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.731381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.731400 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.834950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.835016 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.835036 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.835060 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.835081 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.938527 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.939101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.939252 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.939389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:15 crc kubenswrapper[4672]: I1206 09:07:15.939508 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:15Z","lastTransitionTime":"2025-12-06T09:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.042274 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.042357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.042382 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.042409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.042431 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.145572 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.145673 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.145692 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.145717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.145735 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.248921 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.249000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.249024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.249059 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.249083 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.352379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.352450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.352463 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.352872 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.352911 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.456230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.456275 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.456288 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.456308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.456325 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.556801 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:16 crc kubenswrapper[4672]: E1206 09:07:16.556959 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.557035 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:16 crc kubenswrapper[4672]: E1206 09:07:16.557282 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.557330 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:16 crc kubenswrapper[4672]: E1206 09:07:16.557564 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.557705 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:16 crc kubenswrapper[4672]: E1206 09:07:16.557846 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.558424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.558468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.558482 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.558503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.558517 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.661905 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.661951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.661963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.661992 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.662006 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.764832 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.764889 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.764902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.764921 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.764934 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.867927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.867986 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.867998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.868021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.868036 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.971275 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.971330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.971341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.971358 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:16 crc kubenswrapper[4672]: I1206 09:07:16.971371 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:16Z","lastTransitionTime":"2025-12-06T09:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.073892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.073970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.073987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.074008 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.074026 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.176816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.176897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.176911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.176932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.176948 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.280082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.280130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.280147 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.280169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.280186 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.384588 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.384696 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.384720 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.384754 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.384778 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.488443 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.488513 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.488534 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.488564 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.488586 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.592083 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.592156 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.592176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.592202 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.592220 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.695865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.695940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.695965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.696000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.696025 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.800003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.800075 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.800089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.800113 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.800132 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.904315 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.904429 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.904455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.904487 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:17 crc kubenswrapper[4672]: I1206 09:07:17.904507 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:17Z","lastTransitionTime":"2025-12-06T09:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.009268 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.009700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.009870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.010068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.010306 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.114199 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.114273 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.114291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.114320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.114343 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.217639 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.217729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.217751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.217778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.217798 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.321323 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.321374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.321384 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.321401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.321411 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.425115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.425207 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.425242 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.425273 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.425297 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.528629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.528683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.528694 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.528711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.528724 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.556922 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.557009 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.557009 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.557028 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:18 crc kubenswrapper[4672]: E1206 09:07:18.557133 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:18 crc kubenswrapper[4672]: E1206 09:07:18.557273 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:18 crc kubenswrapper[4672]: E1206 09:07:18.557486 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:18 crc kubenswrapper[4672]: E1206 09:07:18.557564 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.632109 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.632165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.632176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.632198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.632211 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.735777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.735853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.735868 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.735893 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.735913 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.839722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.839789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.839806 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.839833 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.839854 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.943704 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.943753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.943770 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.943794 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:18 crc kubenswrapper[4672]: I1206 09:07:18.943814 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:18Z","lastTransitionTime":"2025-12-06T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.046443 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.046510 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.046532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.046568 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.046649 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.149127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.149201 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.149221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.149245 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.149267 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.252173 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.252217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.252227 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.252244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.252254 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.356014 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.356328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.356431 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.356554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.356732 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.460758 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.461204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.461374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.461517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.461694 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.564410 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.564736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.564842 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.564932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.565029 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.668145 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.668213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.668235 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.668264 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.668286 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.770372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.770659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.770761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.770838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.770913 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.874135 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.874202 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.874215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.874232 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.874245 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.977391 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.977434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.977447 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.977466 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:19 crc kubenswrapper[4672]: I1206 09:07:19.977481 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:19Z","lastTransitionTime":"2025-12-06T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.080126 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.080658 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.080864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.081020 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.081150 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.184564 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.184668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.184688 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.184718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.184735 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.288595 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.288691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.288709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.288735 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.288755 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.391839 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.391882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.391894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.391912 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.391924 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.494391 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.494427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.494438 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.494454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.494466 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.556778 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.556786 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.556812 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:20 crc kubenswrapper[4672]: E1206 09:07:20.557189 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:20 crc kubenswrapper[4672]: E1206 09:07:20.556949 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:20 crc kubenswrapper[4672]: E1206 09:07:20.557240 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.557289 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:20 crc kubenswrapper[4672]: E1206 09:07:20.557473 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.596230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.596277 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.596287 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.596305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.596316 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.699066 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.699108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.699116 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.699133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.699144 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.801831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.801917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.801937 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.801964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.801981 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.904458 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.904515 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.904532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.904559 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:20 crc kubenswrapper[4672]: I1206 09:07:20.904576 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:20Z","lastTransitionTime":"2025-12-06T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.007647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.007697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.007713 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.007736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.007752 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.110426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.110468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.110477 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.110496 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.110505 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.212895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.212923 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.212931 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.212943 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.212952 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.316757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.316810 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.316819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.316834 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.316846 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.419631 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.419662 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.419671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.419703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.419713 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.522552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.522631 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.522641 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.522661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.522671 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.625858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.626293 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.626450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.627088 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.627372 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.734535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.734576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.734594 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.734637 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.734652 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.837852 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.837900 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.837911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.837930 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.837944 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.940949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.941290 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.941422 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.941584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:21 crc kubenswrapper[4672]: I1206 09:07:21.941715 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:21Z","lastTransitionTime":"2025-12-06T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.044408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.044455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.044464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.044482 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.044497 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.147495 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.147543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.147554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.147571 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.147584 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.250958 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.251019 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.251033 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.251051 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.251265 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.354871 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.354925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.354934 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.354950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.354964 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.458047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.458102 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.458116 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.458138 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.458150 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.555849 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.555899 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.555849 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.555849 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:22 crc kubenswrapper[4672]: E1206 09:07:22.556069 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:22 crc kubenswrapper[4672]: E1206 09:07:22.556411 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:22 crc kubenswrapper[4672]: E1206 09:07:22.556614 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:22 crc kubenswrapper[4672]: E1206 09:07:22.556739 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.560967 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.561062 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.561074 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.562839 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.562970 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.572364 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.584336 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.593839 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.609667 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.626382 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.647688 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.661090 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.666014 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.666066 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.666085 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.666107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.666120 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.676404 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.692672 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.707338 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.722690 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.736433 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.750146 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc 
kubenswrapper[4672]: I1206 09:07:22.765541 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.768611 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.768688 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.768702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.768726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.768741 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.779269 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.793837 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.820950 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:22Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.871656 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.872098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.872110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.872129 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.872140 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.974651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.974689 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.974697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.974729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:22 crc kubenswrapper[4672]: I1206 09:07:22.974740 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:22Z","lastTransitionTime":"2025-12-06T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.077254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.077305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.077318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.077335 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.077349 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.181148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.181234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.181260 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.181294 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.181324 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.283798 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.283855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.283864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.283880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.283890 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.387209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.387254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.387270 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.387291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.387304 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.490414 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.490473 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.490498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.490515 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.490526 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.593483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.593542 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.593554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.593576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.593591 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.696378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.696416 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.696427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.696477 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.696487 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.799824 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.799880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.799896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.799920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.799936 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.903284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.903337 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.903347 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.903371 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:23 crc kubenswrapper[4672]: I1206 09:07:23.903395 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:23Z","lastTransitionTime":"2025-12-06T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.007031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.007100 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.007115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.007136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.007151 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.110847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.110924 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.110945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.110991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.111022 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.213221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.213349 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.213382 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.213410 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.213428 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.316720 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.317013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.317104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.317198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.317274 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.419751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.419796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.419811 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.419832 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.419847 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.523047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.523119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.523136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.523183 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.523199 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.556998 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.557112 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.557026 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.557242 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:24 crc kubenswrapper[4672]: E1206 09:07:24.557245 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:24 crc kubenswrapper[4672]: E1206 09:07:24.557330 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:24 crc kubenswrapper[4672]: E1206 09:07:24.557478 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:24 crc kubenswrapper[4672]: E1206 09:07:24.557731 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.626962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.627282 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.627355 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.627455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.627538 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.731278 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.731347 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.731363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.731388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.731406 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.834415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.834496 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.834512 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.834535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.834555 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.937712 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.938134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.938281 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.938422 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:24 crc kubenswrapper[4672]: I1206 09:07:24.938585 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:24Z","lastTransitionTime":"2025-12-06T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.042230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.042551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.042763 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.042903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.043049 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.147662 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.147715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.147726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.147746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.147764 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.256137 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.256192 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.256203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.256226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.256242 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.358964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.359031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.359043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.359057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.359067 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.461993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.462042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.462055 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.462075 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.462089 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.539216 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.539244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.539255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.539269 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.539279 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.554562 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:25Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.557008 4672 scope.go:117] "RemoveContainer" containerID="24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c" Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.557244 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.559756 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.559972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.560073 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.560147 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.560230 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.572863 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:25Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.577350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.577532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
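The repeated "Error updating node status, will retry" failures above never reach the API server's merge logic at all: the node-identity webhook at 127.0.0.1:9743 presents a certificate whose notAfter (2025-08-24T17:21:41Z) is months before the node's wall clock (2025-12-06T09:07:25Z), so the TLS client rejects the connection first. A minimal sketch of the comparison the x509 verifier performs, using only the two instants quoted in the error text:

    from datetime import datetime, timezone

    # The two instants quoted in the x509 error above.
    now       = datetime(2025, 12, 6, 9, 7, 25, tzinfo=timezone.utc)
    not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)

    # Verification fails whenever `now` falls outside [notBefore, notAfter];
    # here the notAfter bound was exceeded by roughly 103 days.
    assert now > not_after
    print("certificate expired", now - not_after, "ago")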
event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.577658 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.577753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.577827 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.592516 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:25Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.597990 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.598050 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.598061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.598080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.598093 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.613127 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:25Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.618645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.618713 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
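[Editor's note, not part of the log: the patch above carries the four node conditions the kubelet is trying to report. A minimal Python sketch, with the condition fields transcribed from the 09:07:25.613127 entry, showing how the patch decodes; only Ready=False is abnormal, since "False" is the healthy state for the pressure conditions.]

# Conditions transcribed from the failed status patch above (illustrative only).
patch = {
    "status": {
        "conditions": [
            {"type": "MemoryPressure", "status": "False", "reason": "KubeletHasSufficientMemory",
             "message": "kubelet has sufficient memory available"},
            {"type": "DiskPressure", "status": "False", "reason": "KubeletHasNoDiskPressure",
             "message": "kubelet has no disk pressure"},
            {"type": "PIDPressure", "status": "False", "reason": "KubeletHasSufficientPID",
             "message": "kubelet has sufficient PID available"},
            {"type": "Ready", "status": "False", "reason": "KubeletNotReady",
             "message": "container runtime network not ready: NetworkReady=false "
                        "reason:NetworkPluginNotReady message:Network plugin returns error: "
                        "no CNI configuration file in /etc/kubernetes/cni/net.d/. "
                        "Has your network provider started?"},
        ]
    }
}

for cond in patch["status"]["conditions"]:
    # For MemoryPressure/DiskPressure/PIDPressure, status "False" means healthy;
    # for Ready, status "False" is what keeps the node NotReady.
    print(f'{cond["type"]:<15} status={cond["status"]:<6} reason={cond["reason"]}')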
event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.618734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.618797 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.618815 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.635030 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:25Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:25 crc kubenswrapper[4672]: E1206 09:07:25.635178 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.637264 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
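[Editor's note, not part of the log: all three patch attempts fail the same way, on the expired serving certificate of the node.network-node-identity webhook at 127.0.0.1:9743 (notAfter 2025-08-24T17:21:41Z per the x509 error). A minimal diagnostic sketch follows, assuming the third-party cryptography package (>= 42 for the *_utc properties) and that the endpoint is reachable from the node; it fetches the certificate despite its expiry and prints the validity window.]

import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509  # third-party; assumed installed

HOST, PORT = "127.0.0.1", 9743  # endpoint from the webhook Post URL in the log

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # we want the cert even though it is expired

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER is available even with CERT_NONE

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc)
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
print("expired:  ", now > cert.not_valid_after_utc)
# Against the clock in the log (2025-12-06T09:07:25Z) this would report
# expired: True, since notAfter is 2025-08-24T17:21:41Z.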
event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.637308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.637323 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.637342 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.637374 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.740133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.740195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.740204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.740219 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.740229 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.843430 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.843479 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.843490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.843510 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.843523 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.946989 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.947043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.947055 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.947076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:25 crc kubenswrapper[4672]: I1206 09:07:25.947089 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:25Z","lastTransitionTime":"2025-12-06T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.049691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.049747 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.049757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.049776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.049789 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.152820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.152878 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.152891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.152928 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.152950 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.255876 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.255930 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.255940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.255959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.255974 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.359721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.359756 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.359769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.359785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.359798 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.463000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.463045 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.463073 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.463093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.463102 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.556430 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.556532 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:26 crc kubenswrapper[4672]: E1206 09:07:26.556658 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:26 crc kubenswrapper[4672]: E1206 09:07:26.556757 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.556460 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:26 crc kubenswrapper[4672]: E1206 09:07:26.556876 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.556456 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:26 crc kubenswrapper[4672]: E1206 09:07:26.556964 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
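[Editor's note, not part of the log: every pod-sync failure above traces back to the same root message, "no CNI configuration file in /etc/kubernetes/cni/net.d/". A quick check sketch, assuming it is run on the node itself, that lists whatever CNI configs are present; on a healthy node the directory holds at least one *.conf/*.conflist file written by the network provider.]

from pathlib import Path

cni_dir = Path("/etc/kubernetes/cni/net.d")  # directory named in the kubelet errors

if not cni_dir.is_dir():
    print(f"{cni_dir} does not exist")
else:
    configs = sorted(cni_dir.glob("*.conf*"))  # matches *.conf and *.conflist
    print(f"{len(configs)} CNI config file(s) found:")
    for p in configs:
        print(" ", p.name)
# An empty result is consistent with the NetworkPluginNotReady errors: the
# network provider has not written its configuration yet, so the node stays NotReady.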
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.565543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.565742 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.565829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.565920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.565990 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.669022 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.669554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.669671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.669746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.669813 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.772576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.772945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.773050 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.773155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.773239 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.875913 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.875951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.875962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.875976 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.875985 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.978435 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.978490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.978501 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.978522 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:26 crc kubenswrapper[4672]: I1206 09:07:26.978533 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:26Z","lastTransitionTime":"2025-12-06T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:27 crc kubenswrapper[4672]: I1206 09:07:27.081209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:27 crc kubenswrapper[4672]: I1206 09:07:27.081262 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:27 crc kubenswrapper[4672]: I1206 09:07:27.081272 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:27 crc kubenswrapper[4672]: I1206 09:07:27.081299 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:27 crc kubenswrapper[4672]: I1206 09:07:27.081311 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:27Z","lastTransitionTime":"2025-12-06T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:27 crc kubenswrapper[4672]: I1206 09:07:27.168710 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:27 crc kubenswrapper[4672]: E1206 09:07:27.169012 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:07:27 crc kubenswrapper[4672]: E1206 09:07:27.169180 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:07:59.16911447 +0000 UTC m=+96.913374757 (durationBeforeRetry 32s). 
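[Editor's note, not part of the log: the "No retries permitted until" timestamp above is just the failure time plus the reported backoff. A one-line arithmetic sketch, with the times transcribed from the entry (Python datetimes stop at microseconds, so the trailing nanosecond digits of Go's 09:07:59.16911447 are not reproduced).]

from datetime import datetime, timedelta, timezone

failed_at = datetime(2025, 12, 6, 9, 7, 27, 169114, tzinfo=timezone.utc)
backoff = timedelta(seconds=32)  # durationBeforeRetry from the log entry
print(failed_at + backoff)       # 2025-12-06 09:07:59.169114+00:00, matching the log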
[The same five node-status entries repeat at 09:07:27.183, .286, .388, .491, .594, .698, .802, .904 and 09:07:28.008, .112; omitted. The final repeat in this window follows, truncated where the log continues:]
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.215502 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.215537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.215546 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.215562 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.215573 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:28Z","lastTransitionTime":"2025-12-06T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.323470 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.323530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.323544 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.323577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.323591 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:28Z","lastTransitionTime":"2025-12-06T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.427066 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.427131 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.427150 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.427176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.427195 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:28Z","lastTransitionTime":"2025-12-06T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.529814 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.529886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.529896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.529917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.529931 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:28Z","lastTransitionTime":"2025-12-06T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
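Every NotReady heartbeat above carries the same root cause: the kubelet finds no CNI network configuration on disk. As a hedged illustration, the Go sketch below (standard library only; the accepted file extensions follow upstream libcni conventions and are an assumption, not something stated in this log) performs a check equivalent to the one that keeps failing:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	found := 0
	for _, e := range entries {
		// libcni conventionally loads .conf, .conflist and .json files
		// (assumption based on upstream CNI behavior).
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", e.Name())
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files; NetworkReady would stay false")
	}
}

Once the network provider (here OVN-Kubernetes) writes its config into that directory, the NetworkReady condition flips and the cycle of NodeNotReady events stops.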
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.556252 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.556350 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.556418 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:28 crc kubenswrapper[4672]: E1206 09:07:28.556451 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:07:28 crc kubenswrapper[4672]: I1206 09:07:28.556473 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:28 crc kubenswrapper[4672]: E1206 09:07:28.556619 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:07:28 crc kubenswrapper[4672]: E1206 09:07:28.556714 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:28 crc kubenswrapper[4672]: E1206 09:07:28.556806 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
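The MountVolume.SetUp failure at the top of this section and the sync error for network-metrics-daemon-w587t above both involve the Secret "openshift-multus"/"metrics-daemon-secret" ("not registered" usually means the kubelet's pod manager has not yet registered the object, not that it is absent). A hedged client-go sketch to confirm the Secret at least exists on the API server (the kubeconfig path is a placeholder; assumes k8s.io/client-go and k8s.io/apimachinery are available):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// "/path/to/kubeconfig" is a hypothetical placeholder.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace and name come from the MountVolume error:
	// object "openshift-multus"/"metrics-daemon-secret" not registered
	s, err := cs.CoreV1().Secrets("openshift-multus").Get(context.TODO(), "metrics-daemon-secret", metav1.GetOptions{})
	if err != nil {
		fmt.Println("secret lookup failed:", err)
		return
	}
	fmt.Printf("secret exists with %d data keys\n", len(s.Data))
}

If the Secret exists, the mount error is transient and clears once the pod's volumes are registered; if not, the multus operator has yet to create it.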
Dec 06 09:07:29 crc kubenswrapper[4672]: I1206 09:07:29.993175 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/0.log"
Dec 06 09:07:29 crc kubenswrapper[4672]: I1206 09:07:29.993240 4672 generic.go:334] "Generic (PLEG): container finished" podID="25b493f7-0dae-4eb4-9499-0564410528f7" containerID="3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328" exitCode=1
Dec 06 09:07:29 crc kubenswrapper[4672]: I1206 09:07:29.993277 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerDied","Data":"3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328"}
Dec 06 09:07:29 crc kubenswrapper[4672]: I1206 09:07:29.993767 4672 scope.go:117] "RemoveContainer" containerID="3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328"
Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.004969 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status [escaped container-status patch elided; node-ca running and ready] for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z"
Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.016699 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status [escaped container-status patch elided; kube-rbac-proxy and network-metrics-daemon both waiting in ContainerCreating, reason ContainersNotReady] for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z"
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.046116 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.062559 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.073331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.073372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.073405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.073425 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.073438 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.075391 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.095990 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\
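
The ovnkube-controller entry above is the root symptom in this stretch: the container exits with code 1 ("failed to run ovnkube: ... could not add Event Handler for anpInformer") and sits in CrashLoopBackOff with "back-off 20s" at restartCount 2. Kubelet restarts crashing containers on an exponential schedule (10s base, doubling per consecutive crash, capped at five minutes); a minimal sketch of that schedule follows, with the restartCount-to-delay mapping being an inference from this log rather than anything kubelet prints:

# Minimal sketch of kubelet's CrashLoopBackOff schedule, assuming the
# stock defaults: 10s base delay, doubled per consecutive crash, capped
# at 300s (five minutes).
def crashloop_backoff(consecutive_crashes: int,
                      base: float = 10.0, cap: float = 300.0) -> float:
    """Delay before the next restart attempt after N consecutive crashes."""
    if consecutive_crashes <= 0:
        return 0.0
    return min(base * 2 ** (consecutive_crashes - 1), cap)

# "back-off 20s" in the waiting message matches two consecutive crashes
# (10s, then 20s), consistent with restartCount=2 in the status above.
assert crashloop_backoff(2) == 20.0
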
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.112136 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.147316 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.164289 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.177564 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.177629 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.177640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.177657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.177672 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.179170 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 
2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.205719 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.223217 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.240519 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
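
Each of these "Failed to update status for pod" entries carries the rejected patch as a doubly escaped JSON string inside err="...". When working a capture like this it helps to recover that patch as structured data; the double unicode_escape pass below mirrors the two layers of quoting visible in these lines and is a heuristic, not a documented journald format:

# Heuristic sketch: recover the rejected status patch embedded in a
# "Failed to update status for pod" journal line as structured JSON.
import codecs
import json
import re

def extract_status_patch(journal_line: str) -> dict:
    err = re.search(r'err="(.*)"', journal_line).group(1)
    once = codecs.decode(err, "unicode_escape")        # outer err="..." layer
    patch = re.search(r'failed to patch status "(\{.*\})" for pod',
                      once).group(1)
    return json.loads(codecs.decode(patch, "unicode_escape"))

# Abbreviated example built from the multus-ks2jd entry above.
line = r'''err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": webhook error"'''
print(extract_status_patch(line)["metadata"]["uid"])
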
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.254670 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.267623 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.280815 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.280842 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.280853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.280873 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.280885 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
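
The recurring KubeletNotReady condition is the container runtime's network probe failing: NetworkReady stays false until a CNI config file appears in /etc/kubernetes/cni/net.d/ (presumably the ovn-kubernetes config, given the CrashLoopBackOff above). A sketch of that directory probe; the .conf/.conflist/.json extension set is the conventional CNI one and is assumed here:

# Sketch of the runtime's NetworkReady probe: is there at least one CNI
# network config in the directory the kubelet complains about?
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")
CNI_EXTENSIONS = {".conf", ".conflist", ".json"}  # conventional CNI set

confs = (sorted(p for p in CNI_CONF_DIR.iterdir()
                if p.suffix in CNI_EXTENSIONS)
         if CNI_CONF_DIR.is_dir() else [])
if confs:
    print("NetworkReady=true, using", confs[0].name)  # lowest name wins
else:
    print(f"NetworkReady=false: no CNI configuration file in {CNI_CONF_DIR}/. "
          "Has your network provider started?")
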
Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.282996 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:30Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.383678 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.383725 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.383736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.383756 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.383768 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.486992 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.487057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.487070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.487087 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.487101 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.556060 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:30 crc kubenswrapper[4672]: E1206 09:07:30.556241 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.556279 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.556298 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.556305 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:30 crc kubenswrapper[4672]: E1206 09:07:30.556456 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:30 crc kubenswrapper[4672]: E1206 09:07:30.556558 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:30 crc kubenswrapper[4672]: E1206 09:07:30.556671 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.590204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.590279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.590289 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.590317 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.590329 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.693184 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.693265 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.693287 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.693316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.693339 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.796166 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.796221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.796232 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.796250 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.796268 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.899861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.899941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.899951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.899968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.899982 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:30Z","lastTransitionTime":"2025-12-06T09:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:30 crc kubenswrapper[4672]: I1206 09:07:30.999343 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/0.log" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:30.999412 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerStarted","Data":"e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.002324 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.002374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.002386 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.002405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.002417 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.013564 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.025919 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.041972 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.056789 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.071128 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.083841 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.097570 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.104820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.104866 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.104880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.104902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.104917 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.112561 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.125114 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.137779 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.153846 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.169916 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.184756 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.199330 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.207173 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.207211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.207229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.207250 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.207264 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.219880 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.233850 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.251897 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:31Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.310030 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.310082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.310095 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.310114 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.310126 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.412275 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.412318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.412327 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.412342 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.412352 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.514662 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.514696 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.514706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.514721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.514731 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.621125 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.621162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.621171 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.621184 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.621194 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.723699 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.723731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.723741 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.723755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.723765 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.826827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.826868 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.826877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.826898 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.826910 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.929933 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.929977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.929987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.930006 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:31 crc kubenswrapper[4672]: I1206 09:07:31.930018 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:31Z","lastTransitionTime":"2025-12-06T09:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.032431 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.032471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.032480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.032494 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.032504 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.135110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.135144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.135154 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.135169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.135181 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.238873 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.238952 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.238973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.239057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.239084 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.342345 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.342399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.342411 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.342432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.342444 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.445242 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.445302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.445321 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.445341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.445353 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.548333 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.548373 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.548384 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.548401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.548413 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.556567 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.556585 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.556693 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:32 crc kubenswrapper[4672]: E1206 09:07:32.556797 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.556846 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:32 crc kubenswrapper[4672]: E1206 09:07:32.557001 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:32 crc kubenswrapper[4672]: E1206 09:07:32.557221 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:32 crc kubenswrapper[4672]: E1206 09:07:32.557135 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.571089 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.585463 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.597103 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.609735 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc 
kubenswrapper[4672]: I1206 09:07:32.625189 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.639946 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.650739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.650794 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.650808 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.650827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.650838 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.652807 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.673685 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f73
4bef0ec54e7a454771638d0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.687590 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.702742 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.719194 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.737965 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.749019 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.754021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.754047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.754056 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 
09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.754071 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.754082 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.764666 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-
12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"
podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.780662 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.793150 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 
09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.807347 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:32Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.855396 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.855427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.855435 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.855450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.855459 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.959388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.959449 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.959461 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.959483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:32 crc kubenswrapper[4672]: I1206 09:07:32.959495 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:32Z","lastTransitionTime":"2025-12-06T09:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.061823 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.061853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.061862 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.061875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.061888 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.165229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.165279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.165293 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.165314 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.165328 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.267436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.267488 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.267500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.267520 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.267536 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.369949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.369985 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.369999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.370015 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.370026 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.472579 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.472860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.472987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.473103 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.473188 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.575479 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.575520 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.575531 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.575549 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.575561 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.677832 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.677890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.677908 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.677933 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.677947 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.780882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.780938 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.780956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.780982 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.780999 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.883423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.883704 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.883790 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.883880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.883963 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.987683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.988080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.988180 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.988274 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:33 crc kubenswrapper[4672]: I1206 09:07:33.988362 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:33Z","lastTransitionTime":"2025-12-06T09:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.091370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.091403 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.091412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.091427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.091437 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.193916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.193964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.193975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.193991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.194002 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.295730 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.295781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.295793 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.295814 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.295825 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.397948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.397991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.398002 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.398018 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.398034 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.500963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.501009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.501020 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.501036 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.501047 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.556810 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.556849 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:34 crc kubenswrapper[4672]: E1206 09:07:34.556968 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.557042 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:34 crc kubenswrapper[4672]: E1206 09:07:34.557146 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:34 crc kubenswrapper[4672]: E1206 09:07:34.557218 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.557282 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:34 crc kubenswrapper[4672]: E1206 09:07:34.557343 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.603385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.603427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.603438 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.603453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.603465 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.705924 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.705963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.705975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.706014 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.706026 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.810508 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.810644 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.810666 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.810690 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.810739 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.914348 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.914403 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.914412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.914428 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:34 crc kubenswrapper[4672]: I1206 09:07:34.914438 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:34Z","lastTransitionTime":"2025-12-06T09:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.017487 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.017552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.017562 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.017582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.017610 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.121076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.121123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.121135 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.121154 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.121165 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.227716 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.228127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.228274 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.228409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.228559 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.331885 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.331964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.331987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.332022 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.332049 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.434138 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.434217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.434236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.434263 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.434280 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.536824 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.536888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.536906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.536930 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.536945 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.639312 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.639367 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.639376 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.639395 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.639410 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.742419 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.742455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.742470 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.742495 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.742511 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.844880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.844923 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.844932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.844948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.844958 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.947676 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.947715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.947724 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.947738 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.947747 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.987484 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.987528 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.987542 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.987561 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:35 crc kubenswrapper[4672]: I1206 09:07:35.987576 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:35Z","lastTransitionTime":"2025-12-06T09:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.001953 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:35Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.005570 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.005625 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.005635 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.005647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.005656 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.023187 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:36Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.027291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.027320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.027332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.027349 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.027360 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.048214 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:36Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.052847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.052896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.052910 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.052928 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.052942 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.066708 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:36Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.070841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.070880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.070895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.070916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.070931 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.084549 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dee4872a-ee41-4a28-b591-3da52b9dd3d6\\\",\\\"systemUUID\\\":\\\"7e6e2ea0-eb53-4cec-8366-444329cefc63\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:36Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.084789 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.087127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
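
The status patch above is rejected on transport, not on content: every node status write from the kubelet has to pass the node.network-node-identity.openshift.io validating webhook on https://127.0.0.1:9743, and the TLS handshake to it fails because the webhook's serving certificate expired on 2025-08-24 while the node clock reads 2025-12-06. After a fixed number of attempts the kubelet gives up and logs the "update node status exceeds retry count" error seen just above. A quick way to confirm which certificate that endpoint is presenting, sketched under the assumption of shell access to the node and the third-party cryptography package (only the host and port are taken from the log; everything else here is illustrative):

```python
# Hypothetical one-off check: print the validity window of the certificate
# served on the webhook endpoint named in the log. CERT_NONE is deliberate,
# so the handshake succeeds even though the certificate is expired.
# Requires the third-party 'cryptography' package (>= 42 for the *_utc props).
import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # endpoint from the webhook error above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False       # must be cleared before setting CERT_NONE
ctx.verify_mode = ssl.CERT_NONE  # we want the cert itself, not verification

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc)
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
print("expired:  ", now > cert.not_valid_after_utc)  # True per this log
```
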
event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.087167 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.087177 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.087192 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.087204 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.190076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.190108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.190117 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.190130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.190139 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.292179 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.292209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.292218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.292234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.292246 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.394192 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.394227 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.394238 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.394254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.394265 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.497084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.497134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.497146 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.497164 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.497176 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.556172 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.556207 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.556220 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.556235 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.556370 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.556424 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.556484 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:36 crc kubenswrapper[4672]: E1206 09:07:36.556738 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.599166 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.599234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.599259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.599290 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.599312 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.701520 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.701577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.701594 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.701647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.701666 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.805777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.805844 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.805862 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.805890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.805914 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.909308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.909357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.909372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.909392 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:36 crc kubenswrapper[4672]: I1206 09:07:36.909406 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:36Z","lastTransitionTime":"2025-12-06T09:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.013047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.013139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.013162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.013191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.013213 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.116207 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.116627 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.116803 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.117017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.117167 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.219297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.219368 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.219381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.219398 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.219411 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.321865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.321949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.321968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.321990 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.322007 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.426027 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.426392 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.426542 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.426722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.426866 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.529873 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.529925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.529947 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.529975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.529996 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.632874 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.632911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.632924 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.632938 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.632950 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.736247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.736316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.736341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.736371 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.736391 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.840322 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.840372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.840384 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.840402 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.840417 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.943676 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.943726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.943735 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.943754 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:37 crc kubenswrapper[4672]: I1206 09:07:37.943765 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:37Z","lastTransitionTime":"2025-12-06T09:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.046643 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.046927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.047009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.047099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.047188 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.150815 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.150882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.150899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.150921 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.150934 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.253881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.253917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.253930 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.253948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.253961 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.356969 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.357299 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.357434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.357557 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.357824 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.460672 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.461106 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.461319 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.461486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.461587 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.556354 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:38 crc kubenswrapper[4672]: E1206 09:07:38.556549 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.556374 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:38 crc kubenswrapper[4672]: E1206 09:07:38.556733 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.556993 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.557569 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.557898 4672 scope.go:117] "RemoveContainer" containerID="24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c" Dec 06 09:07:38 crc kubenswrapper[4672]: E1206 09:07:38.557935 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:38 crc kubenswrapper[4672]: E1206 09:07:38.557969 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.563345 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.563395 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.563415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.563439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.563461 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.666070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.666115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.666127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.666148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.666162 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.769729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.769803 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.769823 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.769891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.769914 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.872639 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.872710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.872722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.872741 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.872754 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.975690 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.975733 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.975746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.975763 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:38 crc kubenswrapper[4672]: I1206 09:07:38.975775 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:38Z","lastTransitionTime":"2025-12-06T09:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.034281 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/2.log" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.041554 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.042985 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.058300 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.071275 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.078667 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.078709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.078721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.078742 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.078754 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.088304 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.104168 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.117756 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.130175 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.147902 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.161055 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d
396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.179229 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.180480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.180612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.180693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.180779 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.180849 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.194514 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.213780 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.231901 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.257337 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc 
kubenswrapper[4672]: I1206 09:07:39.271951 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.283830 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.284127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.284253 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.284394 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.284507 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.290994 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.305135 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.322684 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:39Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.386976 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.387021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.387053 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.387068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.387078 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.489244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.489278 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.489287 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.489300 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.489310 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.592090 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.592129 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.592139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.592153 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.592161 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.694887 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.694916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.694925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.694941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.694949 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.806546 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.806895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.806997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.807080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.807177 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.909559 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.909651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.909664 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.909682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:39 crc kubenswrapper[4672]: I1206 09:07:39.909714 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:39Z","lastTransitionTime":"2025-12-06T09:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.011789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.011861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.011880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.011906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.011926 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.047870 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/3.log" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.048630 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/2.log" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.052227 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" exitCode=1 Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.052321 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.052535 4672 scope.go:117] "RemoveContainer" containerID="24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.054264 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:07:40 crc kubenswrapper[4672]: E1206 09:07:40.054739 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.069131 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.086139 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.104151 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.114018 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.114072 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.114088 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.114108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.114123 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.121718 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.139351 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.151995 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.169196 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24568caa62492721d1b712eac7a48bdb14d98f734bef0ec54e7a454771638d0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"message\\\":\\\"aler-operator]} name:Service_openshift-machine-api/cluster-autoscaler-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 09:07:12.323738 6217 services_controller.go:452] Built service openshift-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323747 6217 services_controller.go:453] Built service openshift-apiserver-operator/metrics template LB for network=default: []services.LB{}\\\\nI1206 09:07:12.323753 6217 services_controller.go:454] Service openshift-apiserver-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1206 09:07:12.323779 6217 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:39Z\\\",\\\"message\\\":\\\"eflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:07:39.560347 6571 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.560413 6571 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.560695 6571 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561084 6571 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561161 6571 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561246 6571 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.574231 6571 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1206 09:07:39.574276 6571 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1206 09:07:39.574354 6571 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:07:39.574380 6571 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 09:07:39.574483 6571 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a84
22dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.180214 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.191290 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.206131 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.215912 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.215942 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.215952 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.215966 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.215975 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.217326 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 
2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.230801 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.243449 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.257401 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.270702 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.282998 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.293276 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d
396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:40Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.317863 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.318063 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.318182 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.318277 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.318338 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.421332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.421645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.421864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.422061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.422217 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.525139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.525493 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.525668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.525801 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.526072 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.556827 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:40 crc kubenswrapper[4672]: E1206 09:07:40.557228 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.556898 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.556828 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.556828 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.557026 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:40 crc kubenswrapper[4672]: E1206 09:07:40.558437 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:40 crc kubenswrapper[4672]: E1206 09:07:40.558676 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:40 crc kubenswrapper[4672]: E1206 09:07:40.558761 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.628476 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.628517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.628531 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.628553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.628567 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.731710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.731785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.731812 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.731840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.731861 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.834680 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.834736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.834748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.834767 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.834781 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
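
The condition object that setters.go prints on each sync is the node's Ready condition exactly as it lands in .status.conditions. Unpacked from the flattened log form into a plain struct for readability (field and JSON tags mirror k8s.io/api/core/v1.NodeCondition; timestamps kept as strings purely for illustration):

```go
// nodecond: reconstruct the condition={...} payload logged above.
// Illustrative only; not cluster code.
package main

import (
	"encoding/json"
	"fmt"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	c := NodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  "2025-12-06T09:07:40Z",
		LastTransitionTime: "2025-12-06T09:07:40Z",
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
	b, _ := json.Marshal(c)
	fmt.Println(string(b)) // reproduces the condition={...} payload above
}
```
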
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.937526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.937827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.937921 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.938002 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:40 crc kubenswrapper[4672]: I1206 09:07:40.938112 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:40Z","lastTransitionTime":"2025-12-06T09:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.041185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.041264 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.041282 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.041310 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.041329 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.059198 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/3.log"
Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.064850 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"
Dec 06 09:07:41 crc kubenswrapper[4672]: E1206 09:07:41.065123 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa"
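
ovnkube-controller has now failed three times and sits in CrashLoopBackOff with a 40s delay, consistent with the kubelet's usual restart back-off: a 10s base doubled after each crash and capped at five minutes. A sketch of that progression; the base and cap are assumptions matching the observed 10s/20s/40s sequence, not values read from this cluster's configuration:

```go
// backoff: the restart delay pattern implied by "back-off 40s restarting
// failed container" at restartCount 3. Constants are assumed, see above.
package main

import (
	"fmt"
	"time"
)

func main() {
	base, maxDelay := 10*time.Second, 5*time.Minute
	delay := base
	for restart := 1; restart <= 6; restart++ {
		fmt.Printf("restart %d: back-off %s\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```
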
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.095788 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.111671 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.136408 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f
85d07f1aab5b7621a8ecc806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:39Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:07:39.560347 6571 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.560413 6571 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.560695 6571 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561084 6571 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561161 6571 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561246 6571 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.574231 6571 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1206 09:07:39.574276 6571 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1206 09:07:39.574354 6571 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:07:39.574380 6571 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 09:07:39.574483 6571 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.144286 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.144343 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.144358 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.144379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.144396 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.165013 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.187058 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.201261 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.213819 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.224229 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.241155 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.246124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.246169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc 
kubenswrapper[4672]: I1206 09:07:41.246184 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.246203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.246217 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.257030 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.269114 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 
09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.283012 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.296742 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.309442 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.320336 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.334493 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:41Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:41 crc 
kubenswrapper[4672]: I1206 09:07:41.348894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.348926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.348935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.348971 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.348979 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.451835 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.452265 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.452330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.452401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.452475 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.554902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.554939 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.554950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.554967 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.554979 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.656978 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.657033 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.657046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.657065 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.657074 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.759746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.759787 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.759796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.759811 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.759822 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.862476 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.862588 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.862653 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.862684 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.862705 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.965409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.965481 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.965497 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.965516 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:41 crc kubenswrapper[4672]: I1206 09:07:41.965528 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:41Z","lastTransitionTime":"2025-12-06T09:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.067415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.067447 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.067457 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.067471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.067480 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.170342 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.170381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.170394 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.170412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.170427 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.273678 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.273753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.273771 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.273796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.273814 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.375690 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.375738 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.375753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.375775 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.375789 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.479369 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.479435 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.479455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.479481 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.479498 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.556851 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.556971 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:42 crc kubenswrapper[4672]: E1206 09:07:42.557033 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:42 crc kubenswrapper[4672]: E1206 09:07:42.557153 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.557296 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:42 crc kubenswrapper[4672]: E1206 09:07:42.557376 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.557430 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:42 crc kubenswrapper[4672]: E1206 09:07:42.557510 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.578438 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.583144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.583179 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.583189 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.583211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.583227 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.593951 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxrkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37625968-279a-4fc1-bfa2-b03868e7363d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca1d5b78e98dc1b35785758a9e44908823d0f5589f5a8d505ea1e909bb97dbb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vls65\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxrkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.614699 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-w587t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fca5f829-3091-4191-abf5-2bece3ab91f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qdq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-w587t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.648760 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c9c6b89f9bc20b99f8b510e4ad21cbe7176fa92aef76489b9771b65a96c34a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.668949 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.685590 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.685663 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.685678 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.685700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.685712 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.690095 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://411a4d94c5616561e415b9e5f9091318969376f85efb6abc9d0e80acfd9d77ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb22be1e0f0f6e6c20de4f649463cec068fa8671c9195d33c855a5c7dc21a22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.703737 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6cf22aa864ae56b106653e6204f7542a80e8533eadd820c6b9c8f5462a0d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmp5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4s7nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.726684 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"713432b9-3b28-4ad0-b578-9d42aa1931aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:39Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 09:07:39.560347 6571 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.560413 6571 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.560695 6571 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561084 6571 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561161 6571 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.561246 6571 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 09:07:39.574231 6571 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1206 09:07:39.574276 6571 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1206 09:07:39.574354 6571 ovnkube.go:599] Stopped ovnkube\\\\nI1206 09:07:39.574380 6571 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 09:07:39.574483 6571 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blgnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xbbs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.740897 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae1cc18-abb2-44a0-a368-2e211e266739\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://111abfbdab45a6256108067f5721a4dc7c30ba86fb03b635515222586085b2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be2b7d9693248ad452729c60f6ad3599f1ead11da1334fc50007a3457242d9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bdaa018e1393770e97100fcf2505232341157f89658f052ba5e27572967e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e00ab8bdef1709d73446eacca39c22e7ea478b5d5a272c362ce234c135b6f21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.754452 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.766841 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad17241658e36d7ff15a0546573b3ccc2fe4da57fa6751a21374a00f6436d5be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.779497 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl2fd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3843b7-3dcd-451e-a394-73bc3f037c9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65d626481b9decd8e6caa43436ce48ea5732e6bba4ae89e22ff00636da864d14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkjbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl2fd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.787937 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.787970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.787982 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.787998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.788010 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.799378 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4471a809-0ca4-44fd-aa93-3d89e87a2291\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://640f821886d65eb7ea8dc8ec35be25c33457d1dfb440632b932dfc0cb39b7b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c131f876ea1ab7f265d5bcbb2938b85b7809342ee8c9c9092735cc38b10b4e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce01cac4
17e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce01cac417e0599160da3e6974a2f05d52e2e34604c45480a6d712ca511800c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153b845dd11937229eccd4ffdc2c2b1834bd3d021fb108b5159e4be77edb7890\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7309dc53f618cb533f2d19c95873abcab5ed47bd2f362a6bb052f5dd01164ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d27d7da0a3deae4e8b43bedf2a5d3a8de02207fee2c47b92d96de7fe45eb0999\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c677cae0f95d00ce21b144d3fcf57431b3bb7f203a4b8ec6d1fb5cd4e21cd782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wr5rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fdr5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.824812 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks2jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b493f7-0dae-4eb4-9499-0564410528f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T09:07:29Z\\\",\\\"message\\\":\\\"2025-12-06T09:06:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552\\\\n2025-12-06T09:06:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b6f0cc5a-271f-4a32-bbe2-902361277552 to /host/opt/cni/bin/\\\\n2025-12-06T09:06:43Z [verbose] multus-daemon started\\\\n2025-12-06T09:06:43Z [verbose] Readiness Indicator file check\\\\n2025-12-06T09:07:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5thfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks2jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.860456 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3505d55c-174e-4512-98f0-983267f3e3ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T09:06:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1206 09:06:34.932202 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 09:06:34.933193 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3267563576/tls.crt::/tmp/serving-cert-3267563576/tls.key\\\\\\\"\\\\nI1206 09:06:40.612789 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 09:06:40.635990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 09:06:40.636035 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 09:06:40.636059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 09:06:40.636064 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 09:06:40.652299 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1206 09:06:40.654218 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654244 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 09:06:40.654249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 09:06:40.654252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 09:06:40.654256 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 09:06:40.654259 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1206 09:06:40.652345 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1206 09:06:40.653818 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T09:06:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.887416 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7156eba8-b3af-4536-82ad-44ed58e21940\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8dabd3dd7ae170fd627f97f601f4a03915fb13937271fa4369ef308b694d35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c6ec4d65ddcd4aacfba04287fb22a8d68370b29bf3bf739ba2e9cb52d7f5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd9704ac9a074cbd28965a8740c71ee6dae8aeb8e8880f8b062448ff8c935e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.893929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.894124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.894524 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.894676 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.894761 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.906074 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61df9d53-92e8-439f-8d15-44e96d25a23e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T09:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8cad8f3bb7aca435b771c2e1843d53eabc28463caaf29de4650edcf6681ca9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e75ceaf93a3d396036177b57e3f468fb6bc704896dc27cd2e8ab6924eab53b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T09:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svwl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T09:06:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ch46n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T09:07:42Z is after 2025-08-24T17:21:41Z" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.997536 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.997591 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.997619 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.997635 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:42 crc kubenswrapper[4672]: I1206 09:07:42.997645 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:42Z","lastTransitionTime":"2025-12-06T09:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.100222 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.100291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.100318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.100349 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.100372 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.203277 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.203332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.203348 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.203373 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.203391 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.307149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.308360 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.308569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.308854 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.309064 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.431563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.431620 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.431634 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.431650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.431663 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.534523 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.535010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.535149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.535294 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.535478 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.639359 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.639438 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.639455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.639482 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.639501 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.743144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.743198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.743207 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.743230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.743259 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.846393 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.846462 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.846480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.846507 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.846527 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.950112 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.950166 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.950179 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.950198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:43 crc kubenswrapper[4672]: I1206 09:07:43.950209 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:43Z","lastTransitionTime":"2025-12-06T09:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.052624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.052673 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.052687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.052706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.052720 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.155998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.156041 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.156050 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.156069 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.156079 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.258521 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.258557 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.258565 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.258582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.258621 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.361416 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.361471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.361484 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.361507 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.361524 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.464053 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.464322 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.464402 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.464539 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.464639 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.556132 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.556592 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.556678 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.556800 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:44 crc kubenswrapper[4672]: E1206 09:07:44.556921 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:44 crc kubenswrapper[4672]: E1206 09:07:44.556999 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:44 crc kubenswrapper[4672]: E1206 09:07:44.557076 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:44 crc kubenswrapper[4672]: E1206 09:07:44.557140 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.567239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.567275 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.567283 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.567301 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.567312 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.578585 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.671267 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.671305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.671314 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.671330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.671341 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.774193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.774257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.774279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.774308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.774328 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.877795 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.877850 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.877870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.877896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.877916 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.980027 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.980715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.980769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.980804 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:44 crc kubenswrapper[4672]: I1206 09:07:44.980828 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:44Z","lastTransitionTime":"2025-12-06T09:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.082888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.082932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.082943 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.082961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.082972 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.186780 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.186856 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.186879 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.186908 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.186931 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.290434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.290758 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.290777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.290792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.290801 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.393813 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.393849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.393860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.393874 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.393884 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.495687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.495727 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.495739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.495756 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.495769 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.598386 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.598441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.598454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.598472 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.598484 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.702436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.702486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.702503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.702525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.702539 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.805951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.806029 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.806047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.806080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.806105 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.908427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.908460 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.908470 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.908483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:45 crc kubenswrapper[4672]: I1206 09:07:45.908491 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:45Z","lastTransitionTime":"2025-12-06T09:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.012258 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.012313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.012325 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.012343 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.012382 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:46Z","lastTransitionTime":"2025-12-06T09:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.115013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.115038 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.115046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.115060 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.115068 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:46Z","lastTransitionTime":"2025-12-06T09:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.218325 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.218370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.218381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.218398 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.218409 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:46Z","lastTransitionTime":"2025-12-06T09:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.321533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.321573 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.321584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.321626 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.321644 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:46Z","lastTransitionTime":"2025-12-06T09:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.393999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.394067 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.394088 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.394117 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.394139 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:46Z","lastTransitionTime":"2025-12-06T09:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.431387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.431766 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.431778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.431816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.431829 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T09:07:46Z","lastTransitionTime":"2025-12-06T09:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
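
Note: the same five node-status records repeat above roughly every 100 ms. The node is pinned NotReady for a single reason: no CNI configuration file has been written to /etc/kubernetes/cni/net.d/ yet, so the container runtime keeps reporting NetworkReady=false. A minimal Go sketch of such a file-presence test, assuming readiness just means a *.conf, *.conflist, or *.json file exists in that directory; the kubelet's real check goes through the CRI runtime status, not the filesystem:

    // cniReady reports whether any CNI network config file is present.
    // Illustrative only; not the kubelet's actual readiness probe.
    package main

    import (
    	"fmt"
    	"path/filepath"
    )

    func cniReady(dir string) bool {
    	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
    		if m, _ := filepath.Glob(filepath.Join(dir, pat)); len(m) > 0 {
    			return true
    		}
    	}
    	return false
    }

    func main() {
    	fmt.Println("NetworkReady:", cniReady("/etc/kubernetes/cni/net.d"))
    }
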
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.471172 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"]
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.471634 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.474179 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.474201 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.474248 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.474854 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.479241 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.479310 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.479446 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.479517 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.479496923 +0000 UTC m=+148.223757220 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.479782 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.479841 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.479827122 +0000 UTC m=+148.224087429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.525993 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.525975136 podStartE2EDuration="2.525975136s" podCreationTimestamp="2025-12-06 09:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.507957334 +0000 UTC m=+84.252217641" watchObservedRunningTime="2025-12-06 09:07:46.525975136 +0000 UTC m=+84.270235423"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.526182 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=64.526178442 podStartE2EDuration="1m4.526178442s" podCreationTimestamp="2025-12-06 09:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.525674868 +0000 UTC m=+84.269935155" watchObservedRunningTime="2025-12-06 09:07:46.526178442 +0000 UTC m=+84.270438729"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.556269 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.556272 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.556374 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.556735 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
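
Note: both failed mounts above are rescheduled for 09:08:50, 64 seconds out (durationBeforeRetry 1m4s). That figure is consistent with a per-volume doubling back-off that has been retrying since the kubelet started. A generic sketch of the pattern, with assumed constants rather than the kubelet's actual ones:

    // Doubling back-off: delay before the nth retry, capped at max.
    // Sketch only; the kubelet's exact constants are not asserted here.
    package main

    import (
    	"fmt"
    	"time"
    )

    func backoff(n int, base, max time.Duration) time.Duration {
    	d := base
    	for i := 0; i < n; i++ {
    		d *= 2
    		if d >= max {
    			return max
    		}
    	}
    	return d
    }

    func main() {
    	// With an assumed 500ms base, retry 7 waits 64s -- the 1m4s above.
    	for n := 0; n <= 7; n++ {
    		fmt.Printf("retry %d: %v\n", n, backoff(n, 500*time.Millisecond, 2*time.Minute))
    	}
    }
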
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.556657 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.556989 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.557170 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.557875 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.562238 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dl2fd" podStartSLOduration=66.562221226 podStartE2EDuration="1m6.562221226s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.56128951 +0000 UTC m=+84.305549817" watchObservedRunningTime="2025-12-06 09:07:46.562221226 +0000 UTC m=+84.306481513"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.577831 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fdr5p" podStartSLOduration=66.577810002 podStartE2EDuration="1m6.577810002s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.577132004 +0000 UTC m=+84.321392291" watchObservedRunningTime="2025-12-06 09:07:46.577810002 +0000 UTC m=+84.322070289"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.579926 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580033 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a4900173-d91a-49eb-8541-87f7b518694f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580055 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4900173-d91a-49eb-8541-87f7b518694f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580080 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580100 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a4900173-d91a-49eb-8541-87f7b518694f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580119 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580142 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4900173-d91a-49eb-8541-87f7b518694f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.580157 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a4900173-d91a-49eb-8541-87f7b518694f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580246 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.580232687 +0000 UTC m=+148.324492974 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580341 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580355 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580365 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580392 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.580385911 +0000 UTC m=+148.324646198 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580547 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580559 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580565 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:07:46 crc kubenswrapper[4672]: E1206 09:07:46.580584 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.580577996 +0000 UTC m=+148.324838283 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.608232 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ks2jd" podStartSLOduration=66.608216716 podStartE2EDuration="1m6.608216716s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.593249466 +0000 UTC m=+84.337509753" watchObservedRunningTime="2025-12-06 09:07:46.608216716 +0000 UTC m=+84.352477003"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.608617 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.608612506 podStartE2EDuration="1m2.608612506s" podCreationTimestamp="2025-12-06 09:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.606774147 +0000 UTC m=+84.351034434" watchObservedRunningTime="2025-12-06 09:07:46.608612506 +0000 UTC m=+84.352872793"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.624047 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ch46n" podStartSLOduration=65.624027179 podStartE2EDuration="1m5.624027179s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.62334808 +0000 UTC m=+84.367608367" watchObservedRunningTime="2025-12-06 09:07:46.624027179 +0000 UTC m=+84.368287466"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.681577 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4900173-d91a-49eb-8541-87f7b518694f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.681702 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a4900173-d91a-49eb-8541-87f7b518694f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.681983 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a4900173-d91a-49eb-8541-87f7b518694f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.682029 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4900173-d91a-49eb-8541-87f7b518694f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.682103 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a4900173-d91a-49eb-8541-87f7b518694f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.682207 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a4900173-d91a-49eb-8541-87f7b518694f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.683909 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a4900173-d91a-49eb-8541-87f7b518694f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.684015 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a4900173-d91a-49eb-8541-87f7b518694f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.696949 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sxrkj" podStartSLOduration=65.696927768 podStartE2EDuration="1m5.696927768s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.676889332 +0000 UTC m=+84.421149649" watchObservedRunningTime="2025-12-06 09:07:46.696927768 +0000 UTC m=+84.441188055"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.696992 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4900173-d91a-49eb-8541-87f7b518694f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.705231 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4900173-d91a-49eb-8541-87f7b518694f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-psfxh\" (UID: \"a4900173-d91a-49eb-8541-87f7b518694f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.724513 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=34.724499615 podStartE2EDuration="34.724499615s" podCreationTimestamp="2025-12-06 09:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.72393383 +0000 UTC m=+84.468194147" watchObservedRunningTime="2025-12-06 09:07:46.724499615 +0000 UTC m=+84.468759902"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.749823 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podStartSLOduration=66.749805182 podStartE2EDuration="1m6.749805182s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:46.74935052 +0000 UTC m=+84.493610837" watchObservedRunningTime="2025-12-06 09:07:46.749805182 +0000 UTC m=+84.494065469"
Dec 06 09:07:46 crc kubenswrapper[4672]: I1206 09:07:46.793003 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh"
Dec 06 09:07:47 crc kubenswrapper[4672]: I1206 09:07:47.081810 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh" event={"ID":"a4900173-d91a-49eb-8541-87f7b518694f","Type":"ContainerStarted","Data":"bcd3e4f2076bafc6672557be1abe39e1d774ee04e000b40386d460804bbdef58"}
Dec 06 09:07:47 crc kubenswrapper[4672]: I1206 09:07:47.081861 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh" event={"ID":"a4900173-d91a-49eb-8541-87f7b518694f","Type":"ContainerStarted","Data":"0922ffb0e9cf752261b54819a3939eee0bf49c6500c9ec6479166b3735e089e4"}
Dec 06 09:07:47 crc kubenswrapper[4672]: I1206 09:07:47.095129 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-psfxh" podStartSLOduration=67.095108397 podStartE2EDuration="1m7.095108397s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:47.094353326 +0000 UTC m=+84.838613623" watchObservedRunningTime="2025-12-06 09:07:47.095108397 +0000 UTC m=+84.839368684"
Dec 06 09:07:48 crc kubenswrapper[4672]: I1206 09:07:48.556261 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:07:48 crc kubenswrapper[4672]: I1206 09:07:48.556368 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:48 crc kubenswrapper[4672]: I1206 09:07:48.556463 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:48 crc kubenswrapper[4672]: I1206 09:07:48.556561 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:48 crc kubenswrapper[4672]: E1206 09:07:48.556633 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:07:48 crc kubenswrapper[4672]: E1206 09:07:48.556779 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:07:48 crc kubenswrapper[4672]: E1206 09:07:48.557218 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:48 crc kubenswrapper[4672]: E1206 09:07:48.557557 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:50 crc kubenswrapper[4672]: I1206 09:07:50.555973 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:50 crc kubenswrapper[4672]: I1206 09:07:50.556058 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:50 crc kubenswrapper[4672]: I1206 09:07:50.556112 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:50 crc kubenswrapper[4672]: E1206 09:07:50.556652 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:07:50 crc kubenswrapper[4672]: E1206 09:07:50.556798 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:50 crc kubenswrapper[4672]: E1206 09:07:50.556935 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:50 crc kubenswrapper[4672]: I1206 09:07:50.556976 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:07:50 crc kubenswrapper[4672]: E1206 09:07:50.557066 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:07:50 crc kubenswrapper[4672]: I1206 09:07:50.571496 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Dec 06 09:07:52 crc kubenswrapper[4672]: I1206 09:07:52.556465 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:07:52 crc kubenswrapper[4672]: E1206 09:07:52.558202 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:07:52 crc kubenswrapper[4672]: I1206 09:07:52.558346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:07:52 crc kubenswrapper[4672]: E1206 09:07:52.558996 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:07:52 crc kubenswrapper[4672]: I1206 09:07:52.558414 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:07:52 crc kubenswrapper[4672]: E1206 09:07:52.559326 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:07:52 crc kubenswrapper[4672]: I1206 09:07:52.558366 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:07:52 crc kubenswrapper[4672]: E1206 09:07:52.559657 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:53 crc kubenswrapper[4672]: I1206 09:07:53.558019 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:07:53 crc kubenswrapper[4672]: E1206 09:07:53.559040 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" Dec 06 09:07:54 crc kubenswrapper[4672]: I1206 09:07:54.556486 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:54 crc kubenswrapper[4672]: I1206 09:07:54.556594 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:54 crc kubenswrapper[4672]: E1206 09:07:54.556712 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:54 crc kubenswrapper[4672]: E1206 09:07:54.556828 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:54 crc kubenswrapper[4672]: I1206 09:07:54.556914 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:54 crc kubenswrapper[4672]: E1206 09:07:54.556960 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:54 crc kubenswrapper[4672]: I1206 09:07:54.557022 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:54 crc kubenswrapper[4672]: E1206 09:07:54.557094 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:56 crc kubenswrapper[4672]: I1206 09:07:56.556238 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:56 crc kubenswrapper[4672]: I1206 09:07:56.556257 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:56 crc kubenswrapper[4672]: I1206 09:07:56.557341 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:56 crc kubenswrapper[4672]: E1206 09:07:56.557655 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:56 crc kubenswrapper[4672]: I1206 09:07:56.557988 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:56 crc kubenswrapper[4672]: E1206 09:07:56.558154 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:56 crc kubenswrapper[4672]: E1206 09:07:56.558244 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:56 crc kubenswrapper[4672]: E1206 09:07:56.558772 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:58 crc kubenswrapper[4672]: I1206 09:07:58.556487 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:07:58 crc kubenswrapper[4672]: I1206 09:07:58.556495 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:07:58 crc kubenswrapper[4672]: E1206 09:07:58.557897 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:07:58 crc kubenswrapper[4672]: I1206 09:07:58.556523 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:58 crc kubenswrapper[4672]: I1206 09:07:58.556512 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:07:58 crc kubenswrapper[4672]: E1206 09:07:58.557976 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:07:58 crc kubenswrapper[4672]: E1206 09:07:58.558110 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:07:58 crc kubenswrapper[4672]: E1206 09:07:58.558239 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:07:59 crc kubenswrapper[4672]: I1206 09:07:59.225179 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:07:59 crc kubenswrapper[4672]: E1206 09:07:59.225418 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:07:59 crc kubenswrapper[4672]: E1206 09:07:59.225501 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs podName:fca5f829-3091-4191-abf5-2bece3ab91f7 nodeName:}" failed. No retries permitted until 2025-12-06 09:09:03.225478027 +0000 UTC m=+160.969738344 (durationBeforeRetry 1m4s). 
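
Note: ovnkube-controller is in CrashLoopBackOff with a 40 s restart delay, and until it runs no CNI config gets written, which is what keeps every pod above pending. The kubelet doubles the crash-loop delay after each failed restart; a 10 s initial delay and a 5 m cap are the usual defaults (assumed here, not read from this cluster's config), which would put the 40 s wait after the third failure:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	d := 10 * time.Second // assumed initial crash-loop delay
    	for i := 1; i <= 6; i++ {
    		fmt.Printf("after failure %d: back-off %v\n", i, d)
    		d *= 2
    		if d > 5*time.Minute {
    			d = 5 * time.Minute // assumed cap
    		}
    	}
    }
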
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs") pod "network-metrics-daemon-w587t" (UID: "fca5f829-3091-4191-abf5-2bece3ab91f7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 09:08:00 crc kubenswrapper[4672]: I1206 09:08:00.556034 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:00 crc kubenswrapper[4672]: I1206 09:08:00.556041 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:00 crc kubenswrapper[4672]: E1206 09:08:00.556224 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:00 crc kubenswrapper[4672]: I1206 09:08:00.556057 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:00 crc kubenswrapper[4672]: E1206 09:08:00.556332 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:00 crc kubenswrapper[4672]: E1206 09:08:00.556492 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:00 crc kubenswrapper[4672]: I1206 09:08:00.556714 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:00 crc kubenswrapper[4672]: E1206 09:08:00.556839 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:02 crc kubenswrapper[4672]: I1206 09:08:02.556659 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:02 crc kubenswrapper[4672]: I1206 09:08:02.556673 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:02 crc kubenswrapper[4672]: I1206 09:08:02.557582 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:08:02 crc kubenswrapper[4672]: I1206 09:08:02.558760 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:02 crc kubenswrapper[4672]: E1206 09:08:02.558862 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:02 crc kubenswrapper[4672]: E1206 09:08:02.558744 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:02 crc kubenswrapper[4672]: E1206 09:08:02.559109 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:02 crc kubenswrapper[4672]: E1206 09:08:02.559238 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:04 crc kubenswrapper[4672]: I1206 09:08:04.556546 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:04 crc kubenswrapper[4672]: E1206 09:08:04.556758 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:04 crc kubenswrapper[4672]: I1206 09:08:04.556546 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:04 crc kubenswrapper[4672]: I1206 09:08:04.556561 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:04 crc kubenswrapper[4672]: I1206 09:08:04.556561 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:04 crc kubenswrapper[4672]: E1206 09:08:04.556953 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:04 crc kubenswrapper[4672]: E1206 09:08:04.557129 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:04 crc kubenswrapper[4672]: E1206 09:08:04.557223 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:06 crc kubenswrapper[4672]: I1206 09:08:06.556088 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:06 crc kubenswrapper[4672]: I1206 09:08:06.556177 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:06 crc kubenswrapper[4672]: I1206 09:08:06.556092 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:06 crc kubenswrapper[4672]: E1206 09:08:06.556278 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:06 crc kubenswrapper[4672]: I1206 09:08:06.556298 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:06 crc kubenswrapper[4672]: E1206 09:08:06.556365 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:06 crc kubenswrapper[4672]: E1206 09:08:06.556436 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:06 crc kubenswrapper[4672]: E1206 09:08:06.556493 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:07 crc kubenswrapper[4672]: I1206 09:08:07.556880 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"
Dec 06 09:08:07 crc kubenswrapper[4672]: E1206 09:08:07.557055 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xbbs5_openshift-ovn-kubernetes(713432b9-3b28-4ad0-b578-9d42aa1931aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa"
Dec 06 09:08:08 crc kubenswrapper[4672]: I1206 09:08:08.556560 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:08 crc kubenswrapper[4672]: I1206 09:08:08.556664 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:08 crc kubenswrapper[4672]: I1206 09:08:08.556561 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:08 crc kubenswrapper[4672]: E1206 09:08:08.556857 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:08 crc kubenswrapper[4672]: E1206 09:08:08.557047 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:08 crc kubenswrapper[4672]: I1206 09:08:08.557136 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:08 crc kubenswrapper[4672]: E1206 09:08:08.557262 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:08 crc kubenswrapper[4672]: E1206 09:08:08.557387 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:10 crc kubenswrapper[4672]: I1206 09:08:10.556816 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:10 crc kubenswrapper[4672]: I1206 09:08:10.556906 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:10 crc kubenswrapper[4672]: I1206 09:08:10.556837 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:10 crc kubenswrapper[4672]: I1206 09:08:10.556816 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:10 crc kubenswrapper[4672]: E1206 09:08:10.557351 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:10 crc kubenswrapper[4672]: E1206 09:08:10.557253 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:10 crc kubenswrapper[4672]: E1206 09:08:10.556979 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:10 crc kubenswrapper[4672]: E1206 09:08:10.557748 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:12 crc kubenswrapper[4672]: I1206 09:08:12.556883 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:12 crc kubenswrapper[4672]: E1206 09:08:12.557099 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:12 crc kubenswrapper[4672]: I1206 09:08:12.558853 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:12 crc kubenswrapper[4672]: I1206 09:08:12.558909 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:12 crc kubenswrapper[4672]: I1206 09:08:12.558964 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:12 crc kubenswrapper[4672]: E1206 09:08:12.559446 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:12 crc kubenswrapper[4672]: E1206 09:08:12.559640 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:12 crc kubenswrapper[4672]: E1206 09:08:12.559938 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:14 crc kubenswrapper[4672]: I1206 09:08:14.555905 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:14 crc kubenswrapper[4672]: I1206 09:08:14.555976 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:14 crc kubenswrapper[4672]: I1206 09:08:14.555996 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:14 crc kubenswrapper[4672]: I1206 09:08:14.556124 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:14 crc kubenswrapper[4672]: E1206 09:08:14.556126 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:14 crc kubenswrapper[4672]: E1206 09:08:14.556272 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:14 crc kubenswrapper[4672]: E1206 09:08:14.556411 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:14 crc kubenswrapper[4672]: E1206 09:08:14.556627 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
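From here until roughly 09:08:38 the same four pods are retried every two seconds, so the raw line count is misleading; the useful signal is which pods are stuck, not how often they are logged. A small tally over the assumed kubelet.log:

    import re
    from collections import Counter

    ERR = re.compile(
        r'"Error syncing pod, skipping".*pod="(?P<pod>[^"]+)" podUID="(?P<uid>[^"]+)"'
    )

    counts = Counter()
    with open("kubelet.log") as fh:  # assumed filename
        for line in fh:
            m = ERR.search(line)
            if m:
                counts[m["pod"]] += 1

    for pod, n in counts.most_common():
        print(f"{n:4d}  {pod}")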
pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.189263 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/1.log" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.190528 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/0.log" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.190581 4672 generic.go:334] "Generic (PLEG): container finished" podID="25b493f7-0dae-4eb4-9499-0564410528f7" containerID="e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140" exitCode=1 Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.190644 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerDied","Data":"e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140"} Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.190701 4672 scope.go:117] "RemoveContainer" containerID="3cb8c1c80b3e467c3c83bf04eafb3d88e6b8c30601fa0cbf1c293185ed8b7328" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.191325 4672 scope.go:117] "RemoveContainer" containerID="e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140" Dec 06 09:08:16 crc kubenswrapper[4672]: E1206 09:08:16.191631 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ks2jd_openshift-multus(25b493f7-0dae-4eb4-9499-0564410528f7)\"" pod="openshift-multus/multus-ks2jd" podUID="25b493f7-0dae-4eb4-9499-0564410528f7" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.210970 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.210946143 podStartE2EDuration="26.210946143s" podCreationTimestamp="2025-12-06 09:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:52.571146227 +0000 UTC m=+90.315406554" watchObservedRunningTime="2025-12-06 09:08:16.210946143 +0000 UTC m=+113.955206460" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.556101 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.556191 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.556221 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:16 crc kubenswrapper[4672]: E1206 09:08:16.556339 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:16 crc kubenswrapper[4672]: E1206 09:08:16.556436 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:16 crc kubenswrapper[4672]: E1206 09:08:16.556585 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:16 crc kubenswrapper[4672]: I1206 09:08:16.556752 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:16 crc kubenswrapper[4672]: E1206 09:08:16.556920 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:17 crc kubenswrapper[4672]: I1206 09:08:17.197999 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/1.log" Dec 06 09:08:18 crc kubenswrapper[4672]: I1206 09:08:18.556593 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:18 crc kubenswrapper[4672]: E1206 09:08:18.556869 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:18 crc kubenswrapper[4672]: I1206 09:08:18.556959 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:18 crc kubenswrapper[4672]: E1206 09:08:18.557157 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:18 crc kubenswrapper[4672]: I1206 09:08:18.557274 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:18 crc kubenswrapper[4672]: E1206 09:08:18.557325 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:18 crc kubenswrapper[4672]: I1206 09:08:18.557515 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:18 crc kubenswrapper[4672]: E1206 09:08:18.557577 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:20 crc kubenswrapper[4672]: I1206 09:08:20.556627 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:20 crc kubenswrapper[4672]: E1206 09:08:20.556854 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:20 crc kubenswrapper[4672]: I1206 09:08:20.557125 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:20 crc kubenswrapper[4672]: E1206 09:08:20.557190 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:20 crc kubenswrapper[4672]: I1206 09:08:20.557350 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:20 crc kubenswrapper[4672]: E1206 09:08:20.557425 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:20 crc kubenswrapper[4672]: I1206 09:08:20.557728 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:20 crc kubenswrapper[4672]: E1206 09:08:20.557810 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:22 crc kubenswrapper[4672]: I1206 09:08:22.556774 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:22 crc kubenswrapper[4672]: I1206 09:08:22.556916 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:22 crc kubenswrapper[4672]: I1206 09:08:22.556930 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:22 crc kubenswrapper[4672]: I1206 09:08:22.558143 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:22 crc kubenswrapper[4672]: E1206 09:08:22.558294 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:22 crc kubenswrapper[4672]: E1206 09:08:22.558440 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:22 crc kubenswrapper[4672]: E1206 09:08:22.558514 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:22 crc kubenswrapper[4672]: E1206 09:08:22.558578 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:22 crc kubenswrapper[4672]: I1206 09:08:22.558583 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:08:22 crc kubenswrapper[4672]: E1206 09:08:22.588361 4672 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 06 09:08:22 crc kubenswrapper[4672]: E1206 09:08:22.621804 4672 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 09:08:23 crc kubenswrapper[4672]: I1206 09:08:23.222253 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/3.log" Dec 06 09:08:23 crc kubenswrapper[4672]: I1206 09:08:23.225763 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerStarted","Data":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} Dec 06 09:08:23 crc kubenswrapper[4672]: I1206 09:08:23.226370 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:08:23 crc kubenswrapper[4672]: I1206 09:08:23.261122 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podStartSLOduration=102.261095461 podStartE2EDuration="1m42.261095461s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:23.260057143 +0000 UTC m=+121.004317450" watchObservedRunningTime="2025-12-06 09:08:23.261095461 +0000 UTC m=+121.005355758" Dec 06 09:08:23 crc kubenswrapper[4672]: I1206 09:08:23.516369 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w587t"] Dec 06 09:08:23 crc kubenswrapper[4672]: I1206 09:08:23.516513 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:23 crc kubenswrapper[4672]: E1206 09:08:23.516648 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:24 crc kubenswrapper[4672]: I1206 09:08:24.556348 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:24 crc kubenswrapper[4672]: I1206 09:08:24.556436 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:24 crc kubenswrapper[4672]: I1206 09:08:24.556348 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:08:24 crc kubenswrapper[4672]: E1206 09:08:24.556584 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:24 crc kubenswrapper[4672]: E1206 09:08:24.556738 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:24 crc kubenswrapper[4672]: E1206 09:08:24.556879 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:25 crc kubenswrapper[4672]: I1206 09:08:25.556407 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:25 crc kubenswrapper[4672]: E1206 09:08:25.556680 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:26 crc kubenswrapper[4672]: I1206 09:08:26.556931 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:26 crc kubenswrapper[4672]: E1206 09:08:26.557116 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:26 crc kubenswrapper[4672]: I1206 09:08:26.557388 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:26 crc kubenswrapper[4672]: E1206 09:08:26.557478 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 09:08:26 crc kubenswrapper[4672]: I1206 09:08:26.557767 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:26 crc kubenswrapper[4672]: E1206 09:08:26.557895 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 09:08:27 crc kubenswrapper[4672]: I1206 09:08:27.556314 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t"
Dec 06 09:08:27 crc kubenswrapper[4672]: E1206 09:08:27.556527 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7"
Dec 06 09:08:27 crc kubenswrapper[4672]: E1206 09:08:27.623474 4672 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 06 09:08:28 crc kubenswrapper[4672]: I1206 09:08:28.106355 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5"
Dec 06 09:08:28 crc kubenswrapper[4672]: I1206 09:08:28.556116 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:28 crc kubenswrapper[4672]: I1206 09:08:28.556162 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:28 crc kubenswrapper[4672]: I1206 09:08:28.556170 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:28 crc kubenswrapper[4672]: E1206 09:08:28.556305 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 09:08:28 crc kubenswrapper[4672]: E1206 09:08:28.556380 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:28 crc kubenswrapper[4672]: E1206 09:08:28.556447 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:29 crc kubenswrapper[4672]: I1206 09:08:29.556499 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:29 crc kubenswrapper[4672]: E1206 09:08:29.556770 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:30 crc kubenswrapper[4672]: I1206 09:08:30.556149 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:30 crc kubenswrapper[4672]: I1206 09:08:30.556205 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:30 crc kubenswrapper[4672]: I1206 09:08:30.556361 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:30 crc kubenswrapper[4672]: E1206 09:08:30.556381 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:30 crc kubenswrapper[4672]: E1206 09:08:30.556488 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:30 crc kubenswrapper[4672]: E1206 09:08:30.556657 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:31 crc kubenswrapper[4672]: I1206 09:08:31.555930 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:31 crc kubenswrapper[4672]: E1206 09:08:31.556211 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:31 crc kubenswrapper[4672]: I1206 09:08:31.556357 4672 scope.go:117] "RemoveContainer" containerID="e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140" Dec 06 09:08:32 crc kubenswrapper[4672]: I1206 09:08:32.262922 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/1.log" Dec 06 09:08:32 crc kubenswrapper[4672]: I1206 09:08:32.262998 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerStarted","Data":"091aa187d1ee2bf8ad4eebac8370dc750f5636fb05c10d1368d28b50dd876465"} Dec 06 09:08:32 crc kubenswrapper[4672]: I1206 09:08:32.556906 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:32 crc kubenswrapper[4672]: I1206 09:08:32.556950 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:32 crc kubenswrapper[4672]: E1206 09:08:32.558219 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:32 crc kubenswrapper[4672]: I1206 09:08:32.558261 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:32 crc kubenswrapper[4672]: E1206 09:08:32.558422 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:32 crc kubenswrapper[4672]: E1206 09:08:32.558764 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:32 crc kubenswrapper[4672]: E1206 09:08:32.624227 4672 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 09:08:33 crc kubenswrapper[4672]: I1206 09:08:33.556717 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:33 crc kubenswrapper[4672]: E1206 09:08:33.556833 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:34 crc kubenswrapper[4672]: I1206 09:08:34.556394 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:34 crc kubenswrapper[4672]: I1206 09:08:34.556421 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:34 crc kubenswrapper[4672]: I1206 09:08:34.556706 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:34 crc kubenswrapper[4672]: E1206 09:08:34.556663 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:34 crc kubenswrapper[4672]: E1206 09:08:34.556925 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:34 crc kubenswrapper[4672]: E1206 09:08:34.557012 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:35 crc kubenswrapper[4672]: I1206 09:08:35.556009 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:35 crc kubenswrapper[4672]: E1206 09:08:35.556296 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:36 crc kubenswrapper[4672]: I1206 09:08:36.556128 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:36 crc kubenswrapper[4672]: I1206 09:08:36.556143 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:36 crc kubenswrapper[4672]: E1206 09:08:36.556335 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 09:08:36 crc kubenswrapper[4672]: E1206 09:08:36.556449 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 09:08:36 crc kubenswrapper[4672]: I1206 09:08:36.556286 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:36 crc kubenswrapper[4672]: E1206 09:08:36.556583 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 09:08:37 crc kubenswrapper[4672]: I1206 09:08:37.556284 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:37 crc kubenswrapper[4672]: E1206 09:08:37.556421 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w587t" podUID="fca5f829-3091-4191-abf5-2bece3ab91f7" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.556027 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.556027 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.556379 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.559365 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.559434 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.559373 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 09:08:38 crc kubenswrapper[4672]: I1206 09:08:38.559860 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 09:08:39 crc kubenswrapper[4672]: I1206 09:08:39.555842 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:08:39 crc kubenswrapper[4672]: I1206 09:08:39.558291 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 09:08:39 crc kubenswrapper[4672]: I1206 09:08:39.558783 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 09:08:47 crc kubenswrapper[4672]: I1206 09:08:47.961050 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.018294 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lcghp"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.019118 4672 util.go:30] "No sandbox for pod can be found. 
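This is the turning point of the capture: the reflector "Caches populated" lines include openshift-multus/metrics-daemon-secret at 09:08:39, which clears the "not registered" mount failure from the top of this window, and NodeReady is recorded at 09:08:47, roughly 145 s after the kubelet t0 derived earlier. Measuring the not-ready window from the saved journal (journald short timestamps carry no year, so one is assumed):

    import re
    from datetime import datetime

    TS = re.compile(r'(Dec \d{2} \d{2}:\d{2}:\d{2})')

    def ts(line):
        m = TS.search(line)
        # The year is assumed (2025, per the structured timestamps elsewhere in this log).
        return datetime.strptime("2025 " + m.group(1), "%Y %b %d %H:%M:%S")

    first_err = ready = None
    with open("kubelet.log") as fh:  # assumed filename
        for line in fh:
            if first_err is None and "NetworkPluginNotReady" in line:
                first_err = ts(line)
            if 'event="NodeReady"' in line:
                ready = ts(line)

    print("network not ready for", ready - first_err)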
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.025234 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.025630 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.025977 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.026201 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.027370 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.030430 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.030722 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.038450 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.043832 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.043898 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.044195 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.060135 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zvxtd"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.060720 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.061275 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.061634 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.063208 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm8cx"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.063855 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.063955 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x88bb"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.064350 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x88bb"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.069912 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071062 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071117 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071293 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071411 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071565 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071416 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071500 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071783 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071500 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071534 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.071854 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.073534 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.073740 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.073872 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.073981 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.073989 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.074260 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.074360 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.074407 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.074508 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.074681 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.076188 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.076232 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.083268 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.083576 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.091647 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.092308 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.104672 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.105002 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b8m6z"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.105360 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q66t7"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.105653 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q66t7"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.105874 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dcdqg"]
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.106037 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.106148 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dcdqg"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.106292 4672 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.121698 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.121807 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.122237 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.122431 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.123321 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.123452 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.123611 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.124008 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.124177 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.124429 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.124582 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.124704 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.124761 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.125295 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.125395 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.125552 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.125750 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126253 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q8xwg"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126287 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126514 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126660 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126760 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126810 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.126827 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.127457 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.127472 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.127863 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.128447 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.142410 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.159758 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.162328 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.162830 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.163283 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.163709 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.164545 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.164709 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.164810 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.164909 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165020 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165129 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165283 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165457 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165640 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165781 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.165823 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.166186 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.166243 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.166409 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178196 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178416 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178446 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178511 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 
09:08:48.178562 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178590 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178679 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178692 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.178763 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.179979 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.180657 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.184977 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.185579 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.185907 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.185982 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.186170 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.186466 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-x9m9h"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.187125 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.192730 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.194920 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.194960 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.196421 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.197099 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.197865 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.198221 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.198293 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.198452 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.198833 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199157 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199278 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199350 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199370 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199398 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199468 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.199576 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200690 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-client-ca\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200742 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-client-ca\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200766 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-etcd-client\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200786 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200811 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200833 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzzgf\" (UniqueName: \"kubernetes.io/projected/87e773f5-6efb-4613-9af8-f05c7af849e1-kube-api-access-fzzgf\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200852 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200871 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vh9z\" (UniqueName: \"kubernetes.io/projected/05e5af51-76dc-4825-bab8-a5048aea49e9-kube-api-access-4vh9z\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200888 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-audit\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200906 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzhs\" (UniqueName: \"kubernetes.io/projected/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-kube-api-access-qbzhs\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200924 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200943 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200962 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.200983 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201000 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-audit-policies\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201016 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae93c21-14e1-4248-98cf-a250cc060f20-config\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201032 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-encryption-config\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201049 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-config\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201067 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-etcd-serving-ca\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201083 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201099 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05e5af51-76dc-4825-bab8-a5048aea49e9-serving-cert\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201117 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201133 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72mz7\" (UniqueName: \"kubernetes.io/projected/2fe5591f-8503-4eea-9b4f-e85419856dd6-kube-api-access-72mz7\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201154 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-node-pullsecrets\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201177 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-policies\") pod 
\"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201194 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj6kn\" (UniqueName: \"kubernetes.io/projected/f28bb046-9dd7-47e0-a498-1928568abe59-kube-api-access-cj6kn\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201212 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-trusted-ca-bundle\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201230 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201248 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-etcd-client\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201278 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201298 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc9lc\" (UniqueName: \"kubernetes.io/projected/64de6d79-f439-4a73-9ac6-605a71c8aab7-kube-api-access-bc9lc\") pod \"downloads-7954f5f757-x88bb\" (UID: \"64de6d79-f439-4a73-9ac6-605a71c8aab7\") " pod="openshift-console/downloads-7954f5f757-x88bb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201313 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28bb046-9dd7-47e0-a498-1928568abe59-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201362 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-config\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201377 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-oauth-serving-cert\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201393 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211614db-3bf5-4db7-9146-cc91303fc217-serving-cert\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201410 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjp5\" (UniqueName: \"kubernetes.io/projected/211614db-3bf5-4db7-9146-cc91303fc217-kube-api-access-lwjp5\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201427 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-serving-cert\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201446 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/87e773f5-6efb-4613-9af8-f05c7af849e1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-serving-cert\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201479 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-service-ca\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201499 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwjj\" (UniqueName: \"kubernetes.io/projected/8ae93c21-14e1-4248-98cf-a250cc060f20-kube-api-access-4bwjj\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201544 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201563 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/87e773f5-6efb-4613-9af8-f05c7af849e1-images\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201578 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-dir\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201609 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae93c21-14e1-4248-98cf-a250cc060f20-trusted-ca\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201633 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-serving-cert\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201666 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-image-import-ca\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201685 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-config\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 
09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201704 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28bb046-9dd7-47e0-a498-1928568abe59-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201721 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2fe5591f-8503-4eea-9b4f-e85419856dd6-audit-dir\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201738 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae93c21-14e1-4248-98cf-a250cc060f20-serving-cert\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201767 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-config\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201785 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201804 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e773f5-6efb-4613-9af8-f05c7af849e1-config\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201826 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-oauth-config\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201843 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201861 4672 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnw2g\" (UniqueName: \"kubernetes.io/projected/de34b8a9-076f-4aa5-acb7-52361b6deeb8-kube-api-access-cnw2g\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201878 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-encryption-config\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201901 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-audit-dir\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.201918 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzdj\" (UniqueName: \"kubernetes.io/projected/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-kube-api-access-fwzdj\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.202097 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.203920 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.207665 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.208431 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.208952 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.209372 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.215818 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-26xdk"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.216935 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.219023 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.225041 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-445qb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.225623 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.226531 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.227921 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.228361 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.228393 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.228409 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.229082 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.229148 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.229980 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbpp6"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.230373 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.230407 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.233108 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.245193 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.248241 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.250823 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.250436 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.255866 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.259839 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.260056 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.260356 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.268581 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.269578 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.270414 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.270523 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.273007 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.276830 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.282439 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.282961 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.283034 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.283802 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.283983 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.284646 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.284876 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bm7cx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.288666 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.286813 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqnzx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.296119 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lcghp"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.296310 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.296414 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.299256 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm8cx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.299998 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zvxtd"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.302007 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x88bb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.302846 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e773f5-6efb-4613-9af8-f05c7af849e1-config\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.302994 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-oauth-config\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303117 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303239 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnw2g\" (UniqueName: \"kubernetes.io/projected/de34b8a9-076f-4aa5-acb7-52361b6deeb8-kube-api-access-cnw2g\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303335 4672 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-encryption-config\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303467 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a7a39312-812f-45a0-ab3f-362048a42c5f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-audit-dir\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303774 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzdj\" (UniqueName: \"kubernetes.io/projected/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-kube-api-access-fwzdj\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.303884 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-metrics-certs\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304029 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-client-ca\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304158 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-client-ca\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304314 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-etcd-client\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304428 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304662 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e773f5-6efb-4613-9af8-f05c7af849e1-config\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304589 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9824eace-01c1-49c3-9094-3f926eda9487-metrics-tls\") pod \"dns-operator-744455d44c-q8xwg\" (UID: \"9824eace-01c1-49c3-9094-3f926eda9487\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.304904 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.305036 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzzgf\" (UniqueName: \"kubernetes.io/projected/87e773f5-6efb-4613-9af8-f05c7af849e1-kube-api-access-fzzgf\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.305151 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.305262 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpvw\" (UniqueName: \"kubernetes.io/projected/475471e2-43d3-46f3-9aa1-b44f497b626f-kube-api-access-xjpvw\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.305427 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vh9z\" (UniqueName: \"kubernetes.io/projected/05e5af51-76dc-4825-bab8-a5048aea49e9-kube-api-access-4vh9z\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.305579 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-audit\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.307062 4672 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qbzhs\" (UniqueName: \"kubernetes.io/projected/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-kube-api-access-qbzhs\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.307259 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.307361 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.307461 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.307550 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtjw\" (UniqueName: \"kubernetes.io/projected/9824eace-01c1-49c3-9094-3f926eda9487-kube-api-access-vrtjw\") pod \"dns-operator-744455d44c-q8xwg\" (UID: \"9824eace-01c1-49c3-9094-3f926eda9487\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.307895 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-stats-auth\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.308075 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9f8f\" (UniqueName: \"kubernetes.io/projected/77d1174d-bfc2-4145-9bf2-c2b648f903e8-kube-api-access-z9f8f\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.308235 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.308402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-audit-policies\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.308550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae93c21-14e1-4248-98cf-a250cc060f20-config\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.315956 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-oauth-config\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.316218 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.317852 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.317901 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.321924 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dcdqg"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.321993 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.323231 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g4dbv"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.327653 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.332583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.332763 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: 
\"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.305233 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-audit-dir\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.337873 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g4dbv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.338259 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.339326 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-client-ca\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.339857 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-client-ca\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340439 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-audit\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340542 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-encryption-config\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340701 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-config\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340739 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-default-certificate\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 
09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340759 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-etcd-serving-ca\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340780 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340940 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/285de502-3cec-4e87-b096-c9485f99ac4b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340972 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05e5af51-76dc-4825-bab8-a5048aea49e9-serving-cert\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.340990 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72mz7\" (UniqueName: \"kubernetes.io/projected/2fe5591f-8503-4eea-9b4f-e85419856dd6-kube-api-access-72mz7\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341063 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae93c21-14e1-4248-98cf-a250cc060f20-config\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341149 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/475471e2-43d3-46f3-9aa1-b44f497b626f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341172 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341197 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-node-pullsecrets\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341218 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/475471e2-43d3-46f3-9aa1-b44f497b626f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.341238 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.342419 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.343301 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.343408 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.343655 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-node-pullsecrets\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.343820 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-policies\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.343977 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj6kn\" (UniqueName: \"kubernetes.io/projected/f28bb046-9dd7-47e0-a498-1928568abe59-kube-api-access-cj6kn\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.344071 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-trusted-ca-bundle\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350384 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350478 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-etcd-client\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350565 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637b32e8-5e9a-47ac-aeaf-60709cdfba63-serving-cert\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350781 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88233a33-81de-4a10-8e6b-bf8ae80beb22-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rcnv8\" (UID: \"88233a33-81de-4a10-8e6b-bf8ae80beb22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350903 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.344617 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-etcd-serving-ca\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.348797 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/05e5af51-76dc-4825-bab8-a5048aea49e9-serving-cert\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.349045 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350939 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.344206 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-config\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351189 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc9lc\" (UniqueName: \"kubernetes.io/projected/64de6d79-f439-4a73-9ac6-605a71c8aab7-kube-api-access-bc9lc\") pod \"downloads-7954f5f757-x88bb\" (UID: \"64de6d79-f439-4a73-9ac6-605a71c8aab7\") " pod="openshift-console/downloads-7954f5f757-x88bb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351270 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28bb046-9dd7-47e0-a498-1928568abe59-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351355 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77d1174d-bfc2-4145-9bf2-c2b648f903e8-service-ca-bundle\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351493 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-config\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351617 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-oauth-serving-cert\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " 
pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351729 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211614db-3bf5-4db7-9146-cc91303fc217-serving-cert\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351829 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjp5\" (UniqueName: \"kubernetes.io/projected/211614db-3bf5-4db7-9146-cc91303fc217-kube-api-access-lwjp5\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.351916 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dgs\" (UniqueName: \"kubernetes.io/projected/285de502-3cec-4e87-b096-c9485f99ac4b-kube-api-access-d2dgs\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-serving-cert\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352110 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/87e773f5-6efb-4613-9af8-f05c7af849e1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352198 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-serving-cert\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352298 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-service-ca\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352391 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352488 
4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-config\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.353784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-oauth-serving-cert\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.345764 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.346390 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.354636 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.348075 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-trusted-ca-bundle\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352589 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwjj\" (UniqueName: \"kubernetes.io/projected/8ae93c21-14e1-4248-98cf-a250cc060f20-kube-api-access-4bwjj\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358066 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/637b32e8-5e9a-47ac-aeaf-60709cdfba63-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358168 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rch7j\" (UniqueName: \"kubernetes.io/projected/637b32e8-5e9a-47ac-aeaf-60709cdfba63-kube-api-access-rch7j\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358250 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/285de502-3cec-4e87-b096-c9485f99ac4b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358331 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcx64\" (UniqueName: \"kubernetes.io/projected/a7a39312-812f-45a0-ab3f-362048a42c5f-kube-api-access-jcx64\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358435 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358527 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/87e773f5-6efb-4613-9af8-f05c7af849e1-images\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358692 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-dir\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359089 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359169 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.358628 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-dir\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359580 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae93c21-14e1-4248-98cf-a250cc060f20-trusted-ca\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359643 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359664 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a7a39312-812f-45a0-ab3f-362048a42c5f-srv-cert\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359708 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-serving-cert\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359729 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wt5\" (UniqueName: \"kubernetes.io/projected/88233a33-81de-4a10-8e6b-bf8ae80beb22-kube-api-access-g7wt5\") pod \"cluster-samples-operator-665b6dd947-rcnv8\" (UID: \"88233a33-81de-4a10-8e6b-bf8ae80beb22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359747 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359785 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-image-import-ca\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359802 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zk26\" (UniqueName: \"kubernetes.io/projected/1babc617-f6e7-4ec3-a4a2-82cd7ca080fb-kube-api-access-4zk26\") pod \"migrator-59844c95c7-qff28\" (UID: \"1babc617-f6e7-4ec3-a4a2-82cd7ca080fb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359824 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-config\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f28bb046-9dd7-47e0-a498-1928568abe59-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359876 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2fe5591f-8503-4eea-9b4f-e85419856dd6-audit-dir\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359901 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae93c21-14e1-4248-98cf-a250cc060f20-serving-cert\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359958 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-config\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.359975 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.345112 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2fe5591f-8503-4eea-9b4f-e85419856dd6-audit-policies\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.367163 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b8m6z"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.367241 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q66t7"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.367256 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q8xwg"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.352416 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-policies\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.370559 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.370643 4672 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbpp6"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.371387 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-config\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.350472 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.371946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae93c21-14e1-4248-98cf-a250cc060f20-trusted-ca\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.372700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-config\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.373780 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-image-import-ca\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.374765 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28bb046-9dd7-47e0-a498-1928568abe59-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.374806 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2fe5591f-8503-4eea-9b4f-e85419856dd6-audit-dir\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.375169 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.377553 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-config\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc 
kubenswrapper[4672]: I1206 09:08:48.377969 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-etcd-client\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.383038 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.383682 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/87e773f5-6efb-4613-9af8-f05c7af849e1-images\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.385028 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-service-ca\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.386247 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae93c21-14e1-4248-98cf-a250cc060f20-serving-cert\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.392513 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/87e773f5-6efb-4613-9af8-f05c7af849e1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.395201 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.397276 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-serving-cert\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.411222 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.411511 4672 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.414211 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.424135 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.427725 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.427875 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-encryption-config\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.428072 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-etcd-client\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.428293 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-encryption-config\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.428345 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28bb046-9dd7-47e0-a498-1928568abe59-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.428805 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.429251 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5591f-8503-4eea-9b4f-e85419856dd6-serving-cert\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.429851 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.431035 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211614db-3bf5-4db7-9146-cc91303fc217-serving-cert\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.435693 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.438237 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-serving-cert\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.442172 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.442771 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.444151 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465034 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465310 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/637b32e8-5e9a-47ac-aeaf-60709cdfba63-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465376 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rch7j\" (UniqueName: \"kubernetes.io/projected/637b32e8-5e9a-47ac-aeaf-60709cdfba63-kube-api-access-rch7j\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465397 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/285de502-3cec-4e87-b096-c9485f99ac4b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465420 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcx64\" (UniqueName: 
\"kubernetes.io/projected/a7a39312-812f-45a0-ab3f-362048a42c5f-kube-api-access-jcx64\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465442 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465495 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a7a39312-812f-45a0-ab3f-362048a42c5f-srv-cert\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465522 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wt5\" (UniqueName: \"kubernetes.io/projected/88233a33-81de-4a10-8e6b-bf8ae80beb22-kube-api-access-g7wt5\") pod \"cluster-samples-operator-665b6dd947-rcnv8\" (UID: \"88233a33-81de-4a10-8e6b-bf8ae80beb22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465543 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465560 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zk26\" (UniqueName: \"kubernetes.io/projected/1babc617-f6e7-4ec3-a4a2-82cd7ca080fb-kube-api-access-4zk26\") pod \"migrator-59844c95c7-qff28\" (UID: \"1babc617-f6e7-4ec3-a4a2-82cd7ca080fb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465637 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a7a39312-812f-45a0-ab3f-362048a42c5f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465667 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5458b22-c606-4b4f-934e-ecb972895455-serving-cert\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465687 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjmg\" (UniqueName: 
\"kubernetes.io/projected/acfb03ae-0ebb-47ec-8433-7de29e729cac-kube-api-access-8qjmg\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465709 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-service-ca-bundle\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.465736 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-metrics-certs\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466411 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9824eace-01c1-49c3-9094-3f926eda9487-metrics-tls\") pod \"dns-operator-744455d44c-q8xwg\" (UID: \"9824eace-01c1-49c3-9094-3f926eda9487\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466454 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466481 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/acfb03ae-0ebb-47ec-8433-7de29e729cac-machine-approver-tls\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466516 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpvw\" (UniqueName: \"kubernetes.io/projected/475471e2-43d3-46f3-9aa1-b44f497b626f-kube-api-access-xjpvw\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466539 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acfb03ae-0ebb-47ec-8433-7de29e729cac-auth-proxy-config\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466563 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466586 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtjw\" (UniqueName: \"kubernetes.io/projected/9824eace-01c1-49c3-9094-3f926eda9487-kube-api-access-vrtjw\") pod \"dns-operator-744455d44c-q8xwg\" (UID: \"9824eace-01c1-49c3-9094-3f926eda9487\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466631 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-config\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466654 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwvd\" (UniqueName: \"kubernetes.io/projected/c5458b22-c606-4b4f-934e-ecb972895455-kube-api-access-9vwvd\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-stats-auth\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466704 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9f8f\" (UniqueName: \"kubernetes.io/projected/77d1174d-bfc2-4145-9bf2-c2b648f903e8-kube-api-access-z9f8f\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466730 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zldc\" (UniqueName: \"kubernetes.io/projected/dda9e216-90cd-466b-a9bd-03fd4914b1b8-kube-api-access-9zldc\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7njz\" (UniqueName: \"kubernetes.io/projected/45b650d8-842d-4139-9858-94c3019f7be2-kube-api-access-v7njz\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466757 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466772 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466858 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-default-certificate\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwpz\" (UniqueName: \"kubernetes.io/projected/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-kube-api-access-2vwpz\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466916 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/285de502-3cec-4e87-b096-c9485f99ac4b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466938 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dda9e216-90cd-466b-a9bd-03fd4914b1b8-serving-cert\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466969 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/475471e2-43d3-46f3-9aa1-b44f497b626f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.466992 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467015 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/45b650d8-842d-4139-9858-94c3019f7be2-signing-cabundle\") pod 
\"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467039 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/475471e2-43d3-46f3-9aa1-b44f497b626f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467056 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467082 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/45b650d8-842d-4139-9858-94c3019f7be2-signing-key\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467117 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-secret-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467139 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acfb03ae-0ebb-47ec-8433-7de29e729cac-config\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467162 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637b32e8-5e9a-47ac-aeaf-60709cdfba63-serving-cert\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467180 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88233a33-81de-4a10-8e6b-bf8ae80beb22-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rcnv8\" (UID: \"88233a33-81de-4a10-8e6b-bf8ae80beb22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467219 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda9e216-90cd-466b-a9bd-03fd4914b1b8-config\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77d1174d-bfc2-4145-9bf2-c2b648f903e8-service-ca-bundle\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467289 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dgs\" (UniqueName: \"kubernetes.io/projected/285de502-3cec-4e87-b096-c9485f99ac4b-kube-api-access-d2dgs\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467312 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-config\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.467928 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-config\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.468527 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/637b32e8-5e9a-47ac-aeaf-60709cdfba63-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.469248 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.469413 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.470226 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/475471e2-43d3-46f3-9aa1-b44f497b626f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.475715 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/285de502-3cec-4e87-b096-c9485f99ac4b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 
crc kubenswrapper[4672]: I1206 09:08:48.480796 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88233a33-81de-4a10-8e6b-bf8ae80beb22-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rcnv8\" (UID: \"88233a33-81de-4a10-8e6b-bf8ae80beb22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.480956 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77d1174d-bfc2-4145-9bf2-c2b648f903e8-service-ca-bundle\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.481113 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a7a39312-812f-45a0-ab3f-362048a42c5f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.481359 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.481811 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.481834 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-metrics-certs\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.482847 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.483713 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a7a39312-812f-45a0-ab3f-362048a42c5f-srv-cert\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.485024 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.487203 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/285de502-3cec-4e87-b096-c9485f99ac4b-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.488367 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rqdv8"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.490376 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/475471e2-43d3-46f3-9aa1-b44f497b626f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.490832 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rqdv8" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.493418 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.495041 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-445qb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.499713 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.502683 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.507793 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.508531 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637b32e8-5e9a-47ac-aeaf-60709cdfba63-serving-cert\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.508953 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9824eace-01c1-49c3-9094-3f926eda9487-metrics-tls\") pod \"dns-operator-744455d44c-q8xwg\" (UID: \"9824eace-01c1-49c3-9094-3f926eda9487\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.509659 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-default-certificate\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.511121 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.515251 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.516519 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqnzx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.519895 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/77d1174d-bfc2-4145-9bf2-c2b648f903e8-stats-auth\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.519976 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.522652 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.523731 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.525396 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.526839 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rqdv8"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.527962 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-26xdk"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.529054 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bm7cx"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.530074 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wt47x"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.531722 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7gxtl"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.532038 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.532493 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7gxtl" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.532852 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wt47x"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.535867 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7gxtl"] Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.541645 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.561260 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568392 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-secret-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568443 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acfb03ae-0ebb-47ec-8433-7de29e729cac-config\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568489 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda9e216-90cd-466b-a9bd-03fd4914b1b8-config\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568666 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5458b22-c606-4b4f-934e-ecb972895455-serving-cert\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568697 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjmg\" (UniqueName: \"kubernetes.io/projected/acfb03ae-0ebb-47ec-8433-7de29e729cac-kube-api-access-8qjmg\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568733 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-service-ca-bundle\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568783 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568815 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/acfb03ae-0ebb-47ec-8433-7de29e729cac-machine-approver-tls\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568864 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acfb03ae-0ebb-47ec-8433-7de29e729cac-auth-proxy-config\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.568927 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569026 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-config\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569054 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwvd\" (UniqueName: \"kubernetes.io/projected/c5458b22-c606-4b4f-934e-ecb972895455-kube-api-access-9vwvd\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569116 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zldc\" (UniqueName: \"kubernetes.io/projected/dda9e216-90cd-466b-a9bd-03fd4914b1b8-kube-api-access-9zldc\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569143 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7njz\" (UniqueName: \"kubernetes.io/projected/45b650d8-842d-4139-9858-94c3019f7be2-kube-api-access-v7njz\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569197 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vwpz\" (UniqueName: \"kubernetes.io/projected/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-kube-api-access-2vwpz\") pod 
\"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569283 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dda9e216-90cd-466b-a9bd-03fd4914b1b8-serving-cert\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569398 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/45b650d8-842d-4139-9858-94c3019f7be2-signing-cabundle\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.569466 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/45b650d8-842d-4139-9858-94c3019f7be2-signing-key\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.572789 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-secret-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.582187 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.601924 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.622103 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.641368 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.662419 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.682421 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.713251 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.721417 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 
09:08:48.722428 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.733177 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5458b22-c606-4b4f-934e-ecb972895455-serving-cert\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.743426 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.750955 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-config\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.762561 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.770720 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5458b22-c606-4b4f-934e-ecb972895455-service-ca-bundle\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.781702 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.802534 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.821710 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.842245 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.862136 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.882061 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.901811 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.922097 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.941862 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 
09:08:48.969438 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 06 09:08:48 crc kubenswrapper[4672]: I1206 09:08:48.981804 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.002428 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.030731 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.042713 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.062317 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.081787 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.101686 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.121576 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.142071 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.162394 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.182617 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.202258 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.221652 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.233976 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/acfb03ae-0ebb-47ec-8433-7de29e729cac-machine-approver-tls\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.242741 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.259969 4672 request.go:700] Waited for 1.000107814s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-machine-approver/configmaps?fieldSelector=metadata.name%3Dkube-rbac-proxy&limit=500&resourceVersion=0
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.265836 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.270589 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acfb03ae-0ebb-47ec-8433-7de29e729cac-auth-proxy-config\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.281397 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.290512 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acfb03ae-0ebb-47ec-8433-7de29e729cac-config\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.302403 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.322185 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.342495 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.361959 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.381894 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.402705 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.421781 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.440946 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.460787 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.471713 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda9e216-90cd-466b-a9bd-03fd4914b1b8-config\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.482360 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.501068 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.513801 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dda9e216-90cd-466b-a9bd-03fd4914b1b8-serving-cert\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.522713 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.541054 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.561199 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 06 09:08:49 crc kubenswrapper[4672]: E1206 09:08:49.569872 4672 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Dec 06 09:08:49 crc kubenswrapper[4672]: E1206 09:08:49.570089 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45b650d8-842d-4139-9858-94c3019f7be2-signing-key podName:45b650d8-842d-4139-9858-94c3019f7be2 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.070064682 +0000 UTC m=+147.814324969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/45b650d8-842d-4139-9858-94c3019f7be2-signing-key") pod "service-ca-9c57cc56f-bm7cx" (UID: "45b650d8-842d-4139-9858-94c3019f7be2") : failed to sync secret cache: timed out waiting for the condition
Dec 06 09:08:49 crc kubenswrapper[4672]: E1206 09:08:49.569954 4672 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Dec 06 09:08:49 crc kubenswrapper[4672]: E1206 09:08:49.570255 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume podName:b8a761a8-3e6d-42eb-b0f8-db388dcf6952 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.070246928 +0000 UTC m=+147.814507215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume") pod "collect-profiles-29416860-wvm9x" (UID: "b8a761a8-3e6d-42eb-b0f8-db388dcf6952") : failed to sync configmap cache: timed out waiting for the condition
Dec 06 09:08:49 crc kubenswrapper[4672]: E1206 09:08:49.569967 4672 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Dec 06 09:08:49 crc kubenswrapper[4672]: E1206 09:08:49.570410 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/45b650d8-842d-4139-9858-94c3019f7be2-signing-cabundle podName:45b650d8-842d-4139-9858-94c3019f7be2 nodeName:}" failed. No retries permitted until 2025-12-06 09:08:50.070402242 +0000 UTC m=+147.814662529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/45b650d8-842d-4139-9858-94c3019f7be2-signing-cabundle") pod "service-ca-9c57cc56f-bm7cx" (UID: "45b650d8-842d-4139-9858-94c3019f7be2") : failed to sync configmap cache: timed out waiting for the condition
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.581258 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.602316 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.621879 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.641230 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.660979 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.682358 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.701492 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.721836 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.742676 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.762417 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.791154 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.801872 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.849236 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzdj\" (UniqueName: \"kubernetes.io/projected/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-kube-api-access-fwzdj\") pod \"oauth-openshift-558db77b4-lm8cx\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.858641 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzzgf\" (UniqueName: \"kubernetes.io/projected/87e773f5-6efb-4613-9af8-f05c7af849e1-kube-api-access-fzzgf\") pod \"machine-api-operator-5694c8668f-b8m6z\" (UID: \"87e773f5-6efb-4613-9af8-f05c7af849e1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.888146 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.892362 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnw2g\" (UniqueName: \"kubernetes.io/projected/de34b8a9-076f-4aa5-acb7-52361b6deeb8-kube-api-access-cnw2g\") pod \"console-f9d7485db-dcdqg\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " pod="openshift-console/console-f9d7485db-dcdqg"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.903510 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.922543 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.927754 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.961565 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vh9z\" (UniqueName: \"kubernetes.io/projected/05e5af51-76dc-4825-bab8-a5048aea49e9-kube-api-access-4vh9z\") pod \"controller-manager-879f6c89f-zvxtd\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd"
Dec 06 09:08:49 crc kubenswrapper[4672]: I1206 09:08:49.979803 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbzhs\" (UniqueName: \"kubernetes.io/projected/3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89-kube-api-access-qbzhs\") pod \"apiserver-76f77b778f-lcghp\" (UID: \"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89\") " pod="openshift-apiserver/apiserver-76f77b778f-lcghp"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.000049 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72mz7\" (UniqueName: \"kubernetes.io/projected/2fe5591f-8503-4eea-9b4f-e85419856dd6-kube-api-access-72mz7\") pod \"apiserver-7bbb656c7d-5fpzc\" (UID: \"2fe5591f-8503-4eea-9b4f-e85419856dd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.021264 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.040420 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj6kn\" (UniqueName: \"kubernetes.io/projected/f28bb046-9dd7-47e0-a498-1928568abe59-kube-api-access-cj6kn\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhb2v\" (UID: \"f28bb046-9dd7-47e0-a498-1928568abe59\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.074181 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dcdqg"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.090988 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.091183 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.091283 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/45b650d8-842d-4139-9858-94c3019f7be2-signing-cabundle\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.091322 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/45b650d8-842d-4139-9858-94c3019f7be2-signing-key\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.092107 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.093085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/45b650d8-842d-4139-9858-94c3019f7be2-signing-cabundle\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.095928 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc9lc\" (UniqueName: \"kubernetes.io/projected/64de6d79-f439-4a73-9ac6-605a71c8aab7-kube-api-access-bc9lc\") pod \"downloads-7954f5f757-x88bb\" (UID: \"64de6d79-f439-4a73-9ac6-605a71c8aab7\") " pod="openshift-console/downloads-7954f5f757-x88bb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.097518 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/45b650d8-842d-4139-9858-94c3019f7be2-signing-key\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.099764 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjp5\" (UniqueName: \"kubernetes.io/projected/211614db-3bf5-4db7-9146-cc91303fc217-kube-api-access-lwjp5\") pod \"route-controller-manager-6576b87f9c-5pxtv\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.121947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwjj\" (UniqueName: \"kubernetes.io/projected/8ae93c21-14e1-4248-98cf-a250cc060f20-kube-api-access-4bwjj\") pod \"console-operator-58897d9998-q66t7\" (UID: \"8ae93c21-14e1-4248-98cf-a250cc060f20\") " pod="openshift-console-operator/console-operator-58897d9998-q66t7"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.143371 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zk26\" (UniqueName: \"kubernetes.io/projected/1babc617-f6e7-4ec3-a4a2-82cd7ca080fb-kube-api-access-4zk26\") pod \"migrator-59844c95c7-qff28\" (UID: \"1babc617-f6e7-4ec3-a4a2-82cd7ca080fb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.145098 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lcghp"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.177253 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21dbd9c3-6afd-44dd-aa63-c1094b853b5d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-47kpj\" (UID: \"21dbd9c3-6afd-44dd-aa63-c1094b853b5d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.181021 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0def7e4f-7c51-4814-9ceb-7ba90a4699ad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r6dhb\" (UID: \"0def7e4f-7c51-4814-9ceb-7ba90a4699ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.188942 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.197095 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.201908 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm8cx"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.212175 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rch7j\" (UniqueName: \"kubernetes.io/projected/637b32e8-5e9a-47ac-aeaf-60709cdfba63-kube-api-access-rch7j\") pod \"openshift-config-operator-7777fb866f-4vzn5\" (UID: \"637b32e8-5e9a-47ac-aeaf-60709cdfba63\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.212472 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.218676 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtjw\" (UniqueName: \"kubernetes.io/projected/9824eace-01c1-49c3-9094-3f926eda9487-kube-api-access-vrtjw\") pod \"dns-operator-744455d44c-q8xwg\" (UID: \"9824eace-01c1-49c3-9094-3f926eda9487\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.223667 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.246366 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcx64\" (UniqueName: \"kubernetes.io/projected/a7a39312-812f-45a0-ab3f-362048a42c5f-kube-api-access-jcx64\") pod \"olm-operator-6b444d44fb-lds42\" (UID: \"a7a39312-812f-45a0-ab3f-362048a42c5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.248359 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x88bb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.257051 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpvw\" (UniqueName: \"kubernetes.io/projected/475471e2-43d3-46f3-9aa1-b44f497b626f-kube-api-access-xjpvw\") pod \"openshift-apiserver-operator-796bbdcf4f-4l2wz\" (UID: \"475471e2-43d3-46f3-9aa1-b44f497b626f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.281530 4672 request.go:700] Waited for 1.795692391s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.283926 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dgs\" (UniqueName: \"kubernetes.io/projected/285de502-3cec-4e87-b096-c9485f99ac4b-kube-api-access-d2dgs\") pod \"kube-storage-version-migrator-operator-b67b599dd-lprjd\" (UID: \"285de502-3cec-4e87-b096-c9485f99ac4b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.287093 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.308436 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wt5\" (UniqueName: \"kubernetes.io/projected/88233a33-81de-4a10-8e6b-bf8ae80beb22-kube-api-access-g7wt5\") pod \"cluster-samples-operator-665b6dd947-rcnv8\" (UID: \"88233a33-81de-4a10-8e6b-bf8ae80beb22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.330211 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.331303 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.332076 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q66t7"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.334418 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9f8f\" (UniqueName: \"kubernetes.io/projected/77d1174d-bfc2-4145-9bf2-c2b648f903e8-kube-api-access-z9f8f\") pod \"router-default-5444994796-x9m9h\" (UID: \"77d1174d-bfc2-4145-9bf2-c2b648f903e8\") " pod="openshift-ingress/router-default-5444994796-x9m9h"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.339085 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x9m9h"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.339771 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.341634 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.359188 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.363726 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.398519 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.409012 4672 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.409462 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.409476 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" event={"ID":"d543231a-ae36-4b66-ac6a-fc3b48a0acb3","Type":"ContainerStarted","Data":"075673aeebdbdc41ac66ce8a8366f6fe6ea1bde1b3e0a17fc12d625948d2e747"}
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.430546 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.441870 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dcdqg"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.445655 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.462833 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.484587 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.486534 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.498976 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.500137 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b8m6z"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.501833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.501872 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.506520 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.509664 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.513460 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.544207 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjmg\" (UniqueName: \"kubernetes.io/projected/acfb03ae-0ebb-47ec-8433-7de29e729cac-kube-api-access-8qjmg\") pod \"machine-approver-56656f9798-24dsx\" (UID: \"acfb03ae-0ebb-47ec-8433-7de29e729cac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.544452 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz"
Dec 06 09:08:50 crc kubenswrapper[4672]: W1206 09:08:50.554989 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde34b8a9_076f_4aa5_acb7_52361b6deeb8.slice/crio-b7240216e085894892f947c16b9d6387fb36461f29a2d91654a708ccb061a26b WatchSource:0}: Error finding container b7240216e085894892f947c16b9d6387fb36461f29a2d91654a708ccb061a26b: Status 404 returned error can't find the container with id b7240216e085894892f947c16b9d6387fb36461f29a2d91654a708ccb061a26b
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.564268 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwvd\" (UniqueName: \"kubernetes.io/projected/c5458b22-c606-4b4f-934e-ecb972895455-kube-api-access-9vwvd\") pod \"authentication-operator-69f744f599-445qb\" (UID: \"c5458b22-c606-4b4f-934e-ecb972895455\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.566340 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.585970 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.590946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vwpz\" (UniqueName: \"kubernetes.io/projected/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-kube-api-access-2vwpz\") pod \"collect-profiles-29416860-wvm9x\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.602908 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zldc\" (UniqueName: \"kubernetes.io/projected/dda9e216-90cd-466b-a9bd-03fd4914b1b8-kube-api-access-9zldc\") pod \"service-ca-operator-777779d784-8phjh\" (UID: \"dda9e216-90cd-466b-a9bd-03fd4914b1b8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.605061 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.605299 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.605332 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:50 crc kubenswrapper[4672]: E1206 09:08:50.606135 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:10:52.606106801 +0000 UTC m=+270.350367088 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.613573 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.616535 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.616888 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.639017 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x88bb"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.646455 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7njz\" (UniqueName: \"kubernetes.io/projected/45b650d8-842d-4139-9858-94c3019f7be2-kube-api-access-v7njz\") pod \"service-ca-9c57cc56f-bm7cx\" (UID: \"45b650d8-842d-4139-9858-94c3019f7be2\") " pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.651641 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zvxtd"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.651695 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.677073 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.703243 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707686 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0807b471-6f7b-4326-b33a-b4e274f94607-srv-cert\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707727 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/efef9ae7-ed6d-40fc-9b40-70b1d55383df-webhook-cert\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707748 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2v7\" (UniqueName: \"kubernetes.io/projected/7116521c-c2a3-4c0e-bacf-9d83f4c59087-kube-api-access-vg2v7\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707786 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzr8\" (UniqueName: \"kubernetes.io/projected/0eaff321-3498-4f22-9b82-89f459cb982c-kube-api-access-wxzr8\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707824 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jms4\" (UniqueName: \"kubernetes.io/projected/0e864b8d-8d39-4fb3-97ad-30963169ecde-kube-api-access-9jms4\") pod \"package-server-manager-789f6589d5-c2g6x\" (UID: \"0e864b8d-8d39-4fb3-97ad-30963169ecde\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707844 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-client\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707864 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707883 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7116521c-c2a3-4c0e-bacf-9d83f4c59087-proxy-tls\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707903 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mck6n\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-kube-api-access-mck6n\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707930 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f874c07b-7566-441d-9546-6c3f7b64de13-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707947 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707973 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a63b61-c938-4435-9112-a02277f6caa4-config\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.707991 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-service-ca\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708017 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56a63b61-c938-4435-9112-a02277f6caa4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708039 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864b8d-8d39-4fb3-97ad-30963169ecde-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-c2g6x\" (UID: \"0e864b8d-8d39-4fb3-97ad-30963169ecde\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708070 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9kb8\" (UniqueName: \"kubernetes.io/projected/9a2d76b4-eb44-49ba-ad51-fbe3022af615-kube-api-access-b9kb8\") pod \"control-plane-machine-set-operator-78cbb6b69f-vrlvb\" (UID: \"9a2d76b4-eb44-49ba-ad51-fbe3022af615\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708087 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f874c07b-7566-441d-9546-6c3f7b64de13-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708130 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708146 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtkj\" (UniqueName: \"kubernetes.io/projected/459ffd9e-e358-41cb-b902-43e162b2c9d9-kube-api-access-lgtkj\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708164 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708182 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmgwg\" (UniqueName: \"kubernetes.io/projected/8760ee59-3419-480a-a540-6640481b8e1e-kube-api-access-fmgwg\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708198 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/459ffd9e-e358-41cb-b902-43e162b2c9d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708216 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54kws\" (UniqueName: \"kubernetes.io/projected/79f3925d-9a36-418b-bd25-50dd03106705-kube-api-access-54kws\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708246 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/79f3925d-9a36-418b-bd25-50dd03106705-node-bootstrap-token\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708263 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7116521c-c2a3-4c0e-bacf-9d83f4c59087-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708311 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-registry-certificates\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708345 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708363 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-registry-tls\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708398 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zldq\" (UniqueName: \"kubernetes.io/projected/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-kube-api-access-6zldq\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708415 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0807b471-6f7b-4326-b33a-b4e274f94607-profile-collector-cert\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708440 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56a63b61-c938-4435-9112-a02277f6caa4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708458 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqcsb\" (UniqueName: \"kubernetes.io/projected/efef9ae7-ed6d-40fc-9b40-70b1d55383df-kube-api-access-hqcsb\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708477 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8760ee59-3419-480a-a540-6640481b8e1e-proxy-tls\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708494 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/459ffd9e-e358-41cb-b902-43e162b2c9d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708513 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a2d76b4-eb44-49ba-ad51-fbe3022af615-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vrlvb\" (UID: \"9a2d76b4-eb44-49ba-ad51-fbe3022af615\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708553 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/efef9ae7-ed6d-40fc-9b40-70b1d55383df-apiservice-cert\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708589 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-trusted-ca\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.708639 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8760ee59-3419-480a-a540-6640481b8e1e-images\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"
Dec 06 09:08:50 crc kubenswrapper[4672]: W1206 09:08:50.711228 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21dbd9c3_6afd_44dd_aa63_c1094b853b5d.slice/crio-811fd059e30dede4e50c0e9571323088e4345979536dfb527d9a34f877fe675c WatchSource:0}: Error finding container 811fd059e30dede4e50c0e9571323088e4345979536dfb527d9a34f877fe675c: Status 404 returned error can't find the container with id 811fd059e30dede4e50c0e9571323088e4345979536dfb527d9a34f877fe675c
Dec 06 09:08:50 crc kubenswrapper[4672]: E1206 09:08:50.711798 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.211760732 +0000 UTC m=+148.956021019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.715684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8760ee59-3419-480a-a540-6640481b8e1e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.716241 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-config\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.717993 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnt8\" (UniqueName: \"kubernetes.io/projected/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-kube-api-access-nhnt8\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718037 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th7r6\" (UniqueName: \"kubernetes.io/projected/1bc7e85b-9077-40c2-a33d-daa00d8b1d47-kube-api-access-th7r6\") pod \"multus-admission-controller-857f4d67dd-4h2gq\" (UID: \"1bc7e85b-9077-40c2-a33d-daa00d8b1d47\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718145 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-bound-sa-token\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718166 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/79f3925d-9a36-418b-bd25-50dd03106705-certs\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718214 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eaff321-3498-4f22-9b82-89f459cb982c-serving-cert\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718283 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/efef9ae7-ed6d-40fc-9b40-70b1d55383df-tmpfs\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718305 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-ca\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718323 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfs7n\" (UniqueName: \"kubernetes.io/projected/0807b471-6f7b-4326-b33a-b4e274f94607-kube-api-access-tfs7n\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718396 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/459ffd9e-e358-41cb-b902-43e162b2c9d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718415 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.718433 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1bc7e85b-9077-40c2-a33d-daa00d8b1d47-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4h2gq\" (UID: \"1bc7e85b-9077-40c2-a33d-daa00d8b1d47\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.757572 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.766900 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lcghp"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.799419 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.803474 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.803578 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.819966 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820215 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/efef9ae7-ed6d-40fc-9b40-70b1d55383df-tmpfs\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-ca\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820292 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfs7n\" (UniqueName: \"kubernetes.io/projected/0807b471-6f7b-4326-b33a-b4e274f94607-kube-api-access-tfs7n\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820316 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c007303-368b-4322-8e3c-7b89c9f29c9e-metrics-tls\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820336 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/459ffd9e-e358-41cb-b902-43e162b2c9d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820353 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820373 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1bc7e85b-9077-40c2-a33d-daa00d8b1d47-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4h2gq\" (UID: \"1bc7e85b-9077-40c2-a33d-daa00d8b1d47\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820392 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-csi-data-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820414 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0807b471-6f7b-4326-b33a-b4e274f94607-srv-cert\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820440 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/efef9ae7-ed6d-40fc-9b40-70b1d55383df-webhook-cert\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820459 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2v7\" (UniqueName: \"kubernetes.io/projected/7116521c-c2a3-4c0e-bacf-9d83f4c59087-kube-api-access-vg2v7\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.820480 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzr8\" (UniqueName: \"kubernetes.io/projected/0eaff321-3498-4f22-9b82-89f459cb982c-kube-api-access-wxzr8\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:50 crc kubenswrapper[4672]: E1206 09:08:50.825539 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.325498563 +0000 UTC m=+149.069758850 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.826164 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/efef9ae7-ed6d-40fc-9b40-70b1d55383df-tmpfs\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.827743 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-ca\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829246 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jms4\" (UniqueName: \"kubernetes.io/projected/0e864b8d-8d39-4fb3-97ad-30963169ecde-kube-api-access-9jms4\") pod \"package-server-manager-789f6589d5-c2g6x\" (UID: \"0e864b8d-8d39-4fb3-97ad-30963169ecde\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829321 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-client\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829362 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829390 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7116521c-c2a3-4c0e-bacf-9d83f4c59087-proxy-tls\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829470 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mck6n\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-kube-api-access-mck6n\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829525 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/56a63b61-c938-4435-9112-a02277f6caa4-config\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-service-ca\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829583 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f874c07b-7566-441d-9546-6c3f7b64de13-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829618 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829673 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56a63b61-c938-4435-9112-a02277f6caa4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829698 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864b8d-8d39-4fb3-97ad-30963169ecde-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-c2g6x\" (UID: \"0e864b8d-8d39-4fb3-97ad-30963169ecde\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829743 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-socket-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829792 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9kb8\" (UniqueName: \"kubernetes.io/projected/9a2d76b4-eb44-49ba-ad51-fbe3022af615-kube-api-access-b9kb8\") pod \"control-plane-machine-set-operator-78cbb6b69f-vrlvb\" (UID: \"9a2d76b4-eb44-49ba-ad51-fbe3022af615\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829821 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f874c07b-7566-441d-9546-6c3f7b64de13-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829865 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmgwg\" (UniqueName: \"kubernetes.io/projected/8760ee59-3419-480a-a540-6640481b8e1e-kube-api-access-fmgwg\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829916 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829937 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtkj\" (UniqueName: \"kubernetes.io/projected/459ffd9e-e358-41cb-b902-43e162b2c9d9-kube-api-access-lgtkj\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829961 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/459ffd9e-e358-41cb-b902-43e162b2c9d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.829987 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54kws\" (UniqueName: \"kubernetes.io/projected/79f3925d-9a36-418b-bd25-50dd03106705-kube-api-access-54kws\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.830011 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/79f3925d-9a36-418b-bd25-50dd03106705-node-bootstrap-token\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.830032 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-mountpoint-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: 
\"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.830058 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7116521c-c2a3-4c0e-bacf-9d83f4c59087-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.830086 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-registration-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.830122 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-registry-certificates\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.844663 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-registry-certificates\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.844786 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.844844 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-registry-tls\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.844874 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4zwh\" (UniqueName: \"kubernetes.io/projected/3a64ce15-c29c-46af-91a1-309857493594-kube-api-access-c4zwh\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.844907 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0807b471-6f7b-4326-b33a-b4e274f94607-profile-collector-cert\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:50 crc 
kubenswrapper[4672]: I1206 09:08:50.844946 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zldq\" (UniqueName: \"kubernetes.io/projected/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-kube-api-access-6zldq\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.844980 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56a63b61-c938-4435-9112-a02277f6caa4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:50 crc kubenswrapper[4672]: E1206 09:08:50.846086 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.346071949 +0000 UTC m=+149.090332236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.849455 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a63b61-c938-4435-9112-a02277f6caa4-config\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.849872 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f874c07b-7566-441d-9546-6c3f7b64de13-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.850586 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7116521c-c2a3-4c0e-bacf-9d83f4c59087-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.851350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-service-ca\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.852937 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.855785 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864b8d-8d39-4fb3-97ad-30963169ecde-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-c2g6x\" (UID: \"0e864b8d-8d39-4fb3-97ad-30963169ecde\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.855880 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.856360 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.856457 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0807b471-6f7b-4326-b33a-b4e274f94607-srv-cert\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.859736 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqcsb\" (UniqueName: \"kubernetes.io/projected/efef9ae7-ed6d-40fc-9b40-70b1d55383df-kube-api-access-hqcsb\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.862669 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8760ee59-3419-480a-a540-6640481b8e1e-proxy-tls\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.863948 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/459ffd9e-e358-41cb-b902-43e162b2c9d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.863995 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9a2d76b4-eb44-49ba-ad51-fbe3022af615-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vrlvb\" (UID: \"9a2d76b4-eb44-49ba-ad51-fbe3022af615\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.864453 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aecee641-b961-499e-9387-0a5d008a205c-cert\") pod \"ingress-canary-7gxtl\" (UID: \"aecee641-b961-499e-9387-0a5d008a205c\") " pod="openshift-ingress-canary/ingress-canary-7gxtl" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.870932 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/79f3925d-9a36-418b-bd25-50dd03106705-node-bootstrap-token\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.871529 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1bc7e85b-9077-40c2-a33d-daa00d8b1d47-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4h2gq\" (UID: \"1bc7e85b-9077-40c2-a33d-daa00d8b1d47\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.871590 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/459ffd9e-e358-41cb-b902-43e162b2c9d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.876153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/459ffd9e-e358-41cb-b902-43e162b2c9d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.876645 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eaff321-3498-4f22-9b82-89f459cb982c-etcd-client\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.876830 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/efef9ae7-ed6d-40fc-9b40-70b1d55383df-apiservice-cert\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.877161 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.877461 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7116521c-c2a3-4c0e-bacf-9d83f4c59087-proxy-tls\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.878023 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a2d76b4-eb44-49ba-ad51-fbe3022af615-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vrlvb\" (UID: \"9a2d76b4-eb44-49ba-ad51-fbe3022af615\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.878348 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/efef9ae7-ed6d-40fc-9b40-70b1d55383df-webhook-cert\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.879364 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c007303-368b-4322-8e3c-7b89c9f29c9e-config-volume\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.879511 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8760ee59-3419-480a-a540-6640481b8e1e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.882637 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8760ee59-3419-480a-a540-6640481b8e1e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.882822 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-trusted-ca\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.882884 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8760ee59-3419-480a-a540-6640481b8e1e-images\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.886390 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56a63b61-c938-4435-9112-a02277f6caa4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.886461 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q66t7"] Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.887371 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-config\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.888523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaff321-3498-4f22-9b82-89f459cb982c-config\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.888803 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-trusted-ca\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.889810 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnt8\" (UniqueName: \"kubernetes.io/projected/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-kube-api-access-nhnt8\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.889870 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-plugins-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.890040 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzr8\" (UniqueName: \"kubernetes.io/projected/0eaff321-3498-4f22-9b82-89f459cb982c-kube-api-access-wxzr8\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.890180 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8760ee59-3419-480a-a540-6640481b8e1e-images\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.890352 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-th7r6\" (UniqueName: \"kubernetes.io/projected/1bc7e85b-9077-40c2-a33d-daa00d8b1d47-kube-api-access-th7r6\") pod \"multus-admission-controller-857f4d67dd-4h2gq\" (UID: \"1bc7e85b-9077-40c2-a33d-daa00d8b1d47\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.891569 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/efef9ae7-ed6d-40fc-9b40-70b1d55383df-apiservice-cert\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.892206 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-bound-sa-token\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.892249 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfs7n\" (UniqueName: \"kubernetes.io/projected/0807b471-6f7b-4326-b33a-b4e274f94607-kube-api-access-tfs7n\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.892261 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/79f3925d-9a36-418b-bd25-50dd03106705-certs\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.892345 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f26fq\" (UniqueName: \"kubernetes.io/projected/aecee641-b961-499e-9387-0a5d008a205c-kube-api-access-f26fq\") pod \"ingress-canary-7gxtl\" (UID: \"aecee641-b961-499e-9387-0a5d008a205c\") " pod="openshift-ingress-canary/ingress-canary-7gxtl" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.892395 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eaff321-3498-4f22-9b82-89f459cb982c-serving-cert\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.892871 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/4c007303-368b-4322-8e3c-7b89c9f29c9e-kube-api-access-kdbzq\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.896188 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-registry-tls\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.898359 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f874c07b-7566-441d-9546-6c3f7b64de13-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.900442 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28"] Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.901208 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.909396 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8760ee59-3419-480a-a540-6640481b8e1e-proxy-tls\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.909592 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0807b471-6f7b-4326-b33a-b4e274f94607-profile-collector-cert\") pod \"catalog-operator-68c6474976-znflx\" (UID: \"0807b471-6f7b-4326-b33a-b4e274f94607\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.911743 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/459ffd9e-e358-41cb-b902-43e162b2c9d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.917182 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eaff321-3498-4f22-9b82-89f459cb982c-serving-cert\") pod \"etcd-operator-b45778765-26xdk\" (UID: \"0eaff321-3498-4f22-9b82-89f459cb982c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.917813 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/79f3925d-9a36-418b-bd25-50dd03106705-certs\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.923665 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg2v7\" (UniqueName: \"kubernetes.io/projected/7116521c-c2a3-4c0e-bacf-9d83f4c59087-kube-api-access-vg2v7\") pod \"machine-config-controller-84d6567774-pjlvh\" (UID: \"7116521c-c2a3-4c0e-bacf-9d83f4c59087\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.943727 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42"] 
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.960194 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb"]
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.967989 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jms4\" (UniqueName: \"kubernetes.io/projected/0e864b8d-8d39-4fb3-97ad-30963169ecde-kube-api-access-9jms4\") pod \"package-server-manager-789f6589d5-c2g6x\" (UID: \"0e864b8d-8d39-4fb3-97ad-30963169ecde\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"
Dec 06 09:08:50 crc kubenswrapper[4672]: I1206 09:08:50.980554 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"
Dec 06 09:08:50 crc kubenswrapper[4672]: W1206 09:08:50.995485 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae93c21_14e1_4248_98cf_a250cc060f20.slice/crio-fbaf00a32458a4caf67b738e710125cb5b4767c71b2359d84983a7a6ee1dca1a WatchSource:0}: Error finding container fbaf00a32458a4caf67b738e710125cb5b4767c71b2359d84983a7a6ee1dca1a: Status 404 returned error can't find the container with id fbaf00a32458a4caf67b738e710125cb5b4767c71b2359d84983a7a6ee1dca1a
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997053 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997359 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/4c007303-368b-4322-8e3c-7b89c9f29c9e-kube-api-access-kdbzq\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c007303-368b-4322-8e3c-7b89c9f29c9e-metrics-tls\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997439 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-csi-data-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997508 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-socket-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:50.997619 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.49756232 +0000 UTC m=+149.241822607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997670 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-mountpoint-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.999289 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-socket-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.999782 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-mountpoint-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.999967 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-csi-data-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.000343 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-registration-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:50.997751 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-registration-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.003388 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4zwh\" (UniqueName: \"kubernetes.io/projected/3a64ce15-c29c-46af-91a1-309857493594-kube-api-access-c4zwh\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.003957 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.004507 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.504487437 +0000 UTC m=+149.248747724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.004620 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aecee641-b961-499e-9387-0a5d008a205c-cert\") pod \"ingress-canary-7gxtl\" (UID: \"aecee641-b961-499e-9387-0a5d008a205c\") " pod="openshift-ingress-canary/ingress-canary-7gxtl"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.004674 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c007303-368b-4322-8e3c-7b89c9f29c9e-config-volume\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.005333 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-plugins-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.005438 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f26fq\" (UniqueName: \"kubernetes.io/projected/aecee641-b961-499e-9387-0a5d008a205c-kube-api-access-f26fq\") pod \"ingress-canary-7gxtl\" (UID: \"aecee641-b961-499e-9387-0a5d008a205c\") " pod="openshift-ingress-canary/ingress-canary-7gxtl"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.005779 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3a64ce15-c29c-46af-91a1-309857493594-plugins-dir\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.005984 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c007303-368b-4322-8e3c-7b89c9f29c9e-config-volume\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.008052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c007303-368b-4322-8e3c-7b89c9f29c9e-metrics-tls\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.014311 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.025090 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.027173 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8"]
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.029559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56a63b61-c938-4435-9112-a02277f6caa4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zl58f\" (UID: \"56a63b61-c938-4435-9112-a02277f6caa4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.031057 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aecee641-b961-499e-9387-0a5d008a205c-cert\") pod \"ingress-canary-7gxtl\" (UID: \"aecee641-b961-499e-9387-0a5d008a205c\") " pod="openshift-ingress-canary/ingress-canary-7gxtl"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.044371 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mck6n\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-kube-api-access-mck6n\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.058007 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgtkj\" (UniqueName: \"kubernetes.io/projected/459ffd9e-e358-41cb-b902-43e162b2c9d9-kube-api-access-lgtkj\") pod \"ingress-operator-5b745b69d9-dvs5c\" (UID: \"459ffd9e-e358-41cb-b902-43e162b2c9d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.060272 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9kb8\" (UniqueName: \"kubernetes.io/projected/9a2d76b4-eb44-49ba-ad51-fbe3022af615-kube-api-access-b9kb8\") pod \"control-plane-machine-set-operator-78cbb6b69f-vrlvb\" (UID: \"9a2d76b4-eb44-49ba-ad51-fbe3022af615\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb"
Dec 06 09:08:51 crc kubenswrapper[4672]: W1206 09:08:51.065484 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0def7e4f_7c51_4814_9ceb_7ba90a4699ad.slice/crio-362001b04d58ca948041b37fa212748932d08843e1ebb560c5596d97b4cffe0b WatchSource:0}: Error finding container 362001b04d58ca948041b37fa212748932d08843e1ebb560c5596d97b4cffe0b: Status 404 returned error can't find the container with id 362001b04d58ca948041b37fa212748932d08843e1ebb560c5596d97b4cffe0b
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.081990 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54kws\" (UniqueName: \"kubernetes.io/projected/79f3925d-9a36-418b-bd25-50dd03106705-kube-api-access-54kws\") pod \"machine-config-server-g4dbv\" (UID: \"79f3925d-9a36-418b-bd25-50dd03106705\") " pod="openshift-machine-config-operator/machine-config-server-g4dbv"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.086267 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.109970 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.111459 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.611427586 +0000 UTC m=+149.355687873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.114494 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmgwg\" (UniqueName: \"kubernetes.io/projected/8760ee59-3419-480a-a540-6640481b8e1e-kube-api-access-fmgwg\") pod \"machine-config-operator-74547568cd-4r8t9\" (UID: \"8760ee59-3419-480a-a540-6640481b8e1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.125050 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g4dbv"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.126106 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"]
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.127574 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zldq\" (UniqueName: \"kubernetes.io/projected/b2b8e997-a3d6-43cc-a637-11f8c0a710ec-kube-api-access-6zldq\") pod \"cluster-image-registry-operator-dc59b4c8b-5ss48\" (UID: \"b2b8e997-a3d6-43cc-a637-11f8c0a710ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.141500 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqcsb\" (UniqueName: \"kubernetes.io/projected/efef9ae7-ed6d-40fc-9b40-70b1d55383df-kube-api-access-hqcsb\") pod \"packageserver-d55dfcdfc-pzl96\" (UID: \"efef9ae7-ed6d-40fc-9b40-70b1d55383df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.169670 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnt8\" (UniqueName: \"kubernetes.io/projected/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-kube-api-access-nhnt8\") pod \"marketplace-operator-79b997595-xqnzx\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.183384 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th7r6\" (UniqueName: \"kubernetes.io/projected/1bc7e85b-9077-40c2-a33d-daa00d8b1d47-kube-api-access-th7r6\") pod \"multus-admission-controller-857f4d67dd-4h2gq\" (UID: \"1bc7e85b-9077-40c2-a33d-daa00d8b1d47\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.195493 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5"]
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.212130 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk"
Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.213547 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.214080 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.714061326 +0000 UTC m=+149.458321623 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.228640 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-bound-sa-token\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.242040 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdbzq\" (UniqueName: \"kubernetes.io/projected/4c007303-368b-4322-8e3c-7b89c9f29c9e-kube-api-access-kdbzq\") pod \"dns-default-rqdv8\" (UID: \"4c007303-368b-4322-8e3c-7b89c9f29c9e\") " pod="openshift-dns/dns-default-rqdv8" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.259840 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.265734 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.267169 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.271752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4zwh\" (UniqueName: \"kubernetes.io/projected/3a64ce15-c29c-46af-91a1-309857493594-kube-api-access-c4zwh\") pod \"csi-hostpathplugin-wt47x\" (UID: \"3a64ce15-c29c-46af-91a1-309857493594\") " pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.289881 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f26fq\" (UniqueName: \"kubernetes.io/projected/aecee641-b961-499e-9387-0a5d008a205c-kube-api-access-f26fq\") pod \"ingress-canary-7gxtl\" (UID: \"aecee641-b961-499e-9387-0a5d008a205c\") " pod="openshift-ingress-canary/ingress-canary-7gxtl" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.294232 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.316276 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.316615 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.816566312 +0000 UTC m=+149.560826599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.316890 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.317337 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.817328435 +0000 UTC m=+149.561588722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.337123 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.355834 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.374155 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.418819 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.419231 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.419743 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.919713707 +0000 UTC m=+149.663973994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.424003 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.427587 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:51.927564262 +0000 UTC m=+149.671824549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.443216 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.449523 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7gxtl" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.453074 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" event={"ID":"d543231a-ae36-4b66-ac6a-fc3b48a0acb3","Type":"ContainerStarted","Data":"c8da844de12dcbf66a5630dbb05a91305e0eced78e6cafaf7ed87af5be982b66"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.453634 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.465512 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" event={"ID":"a7a39312-812f-45a0-ab3f-362048a42c5f","Type":"ContainerStarted","Data":"a529d1af88b0a8b6545ceb508d0c517cc065ad86b415ff67a81c583c8dbdc936"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.479773 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" event={"ID":"21dbd9c3-6afd-44dd-aa63-c1094b853b5d","Type":"ContainerStarted","Data":"811fd059e30dede4e50c0e9571323088e4345979536dfb527d9a34f877fe675c"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.481559 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dcdqg" event={"ID":"de34b8a9-076f-4aa5-acb7-52361b6deeb8","Type":"ContainerStarted","Data":"b7240216e085894892f947c16b9d6387fb36461f29a2d91654a708ccb061a26b"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.487406 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" event={"ID":"0def7e4f-7c51-4814-9ceb-7ba90a4699ad","Type":"ContainerStarted","Data":"362001b04d58ca948041b37fa212748932d08843e1ebb560c5596d97b4cffe0b"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.488051 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" event={"ID":"87e773f5-6efb-4613-9af8-f05c7af849e1","Type":"ContainerStarted","Data":"051d7333ceab117c4eacdfc5171fac77b3bd5e99e40f2557ad180f528514593e"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.490229 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" event={"ID":"f28bb046-9dd7-47e0-a498-1928568abe59","Type":"ContainerStarted","Data":"f038b9c6c469252fa7e9f56220424daff5e78c1bf0d4c77d5c651accbb0954ad"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.490896 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" event={"ID":"05e5af51-76dc-4825-bab8-a5048aea49e9","Type":"ContainerStarted","Data":"f6b1e85ec4948743ad314bc62ad1e35b28c3c6710cd616e3a5b5d228b329c479"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.510573 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" event={"ID":"1babc617-f6e7-4ec3-a4a2-82cd7ca080fb","Type":"ContainerStarted","Data":"f14b31531744d7d2b110aa8e7f4682ef380cb56ae35d855bfb84d1f6e73fa709"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.520088 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rqdv8" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.529958 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.532513 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.032487651 +0000 UTC m=+149.776747938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.543879 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x88bb" event={"ID":"64de6d79-f439-4a73-9ac6-605a71c8aab7","Type":"ContainerStarted","Data":"6fbd52396778a6e857d9bd7953e89dbac11b4080e00de59f3b8d2894e40b935b"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.558713 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" event={"ID":"acfb03ae-0ebb-47ec-8433-7de29e729cac","Type":"ContainerStarted","Data":"c75422316229f9a9e05899e2988fcd69a9037676162b294d276d988347d826cf"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.582775 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" event={"ID":"2fe5591f-8503-4eea-9b4f-e85419856dd6","Type":"ContainerDied","Data":"1ffbe22919e68efe3d9a2356f19389c289d94434df4e78d7994b069e3a22829a"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.581292 4672 generic.go:334] "Generic (PLEG): container finished" podID="2fe5591f-8503-4eea-9b4f-e85419856dd6" containerID="1ffbe22919e68efe3d9a2356f19389c289d94434df4e78d7994b069e3a22829a" exitCode=0 Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.604310 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" event={"ID":"2fe5591f-8503-4eea-9b4f-e85419856dd6","Type":"ContainerStarted","Data":"5f2f216c12aef5faf01faa23370a76bfc94fdfbeacff459390f23745736873d9"} Dec 06 09:08:51 crc kubenswrapper[4672]: W1206 09:08:51.626391 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-f7b94be5ee3d28c711fa7d798052267efefce3ed977cdb3af2a358327ed9b3ea WatchSource:0}: Error finding container f7b94be5ee3d28c711fa7d798052267efefce3ed977cdb3af2a358327ed9b3ea: Status 404 returned error can't find the container with id f7b94be5ee3d28c711fa7d798052267efefce3ed977cdb3af2a358327ed9b3ea Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.628072 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-lcghp" event={"ID":"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89","Type":"ContainerStarted","Data":"2a34458d731d1a630714b5c1a0c0eb00a13487635fabafaa656baf66673b5824"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.637951 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.653916 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.153893912 +0000 UTC m=+149.898154199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.718065 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x9m9h" event={"ID":"77d1174d-bfc2-4145-9bf2-c2b648f903e8","Type":"ContainerStarted","Data":"c65042e7a6dd71a53a6123e8acff445d5e72755972c628aee479fe2123ab9d59"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.732559 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" event={"ID":"88233a33-81de-4a10-8e6b-bf8ae80beb22","Type":"ContainerStarted","Data":"f52f8376f8593ac130bf3f33e8dd24b6be4edb3b095d73ff0a541ccff607df51"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.734494 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q66t7" event={"ID":"8ae93c21-14e1-4248-98cf-a250cc060f20","Type":"ContainerStarted","Data":"fbaf00a32458a4caf67b738e710125cb5b4767c71b2359d84983a7a6ee1dca1a"} Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.780740 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.780878 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.280859589 +0000 UTC m=+150.025119876 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.787432 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.788281 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.288259121 +0000 UTC m=+150.032519408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.891409 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.892177 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.392157278 +0000 UTC m=+150.136417565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.896297 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q8xwg"] Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.906402 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz"] Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.917079 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd"] Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.922202 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-445qb"] Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.923486 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.942632 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" podStartSLOduration=131.942614497 podStartE2EDuration="2m11.942614497s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:51.941747822 +0000 UTC m=+149.686008109" watchObservedRunningTime="2025-12-06 09:08:51.942614497 +0000 UTC m=+149.686874784" Dec 06 09:08:51 crc kubenswrapper[4672]: I1206 09:08:51.992959 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:51 crc kubenswrapper[4672]: E1206 09:08:51.993409 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.493393376 +0000 UTC m=+150.237653663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.032482 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"] Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.062811 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8phjh"] Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.093583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.094009 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.593982035 +0000 UTC m=+150.338242332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.195501 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.196353 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.696340227 +0000 UTC m=+150.440600504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: W1206 09:08:52.212057 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9824eace_01c1_49c3_9094_3f926eda9487.slice/crio-ed26b2c8eb23f9d8a6d0896fa3217868395bc48feb838346662f291ddd06120e WatchSource:0}: Error finding container ed26b2c8eb23f9d8a6d0896fa3217868395bc48feb838346662f291ddd06120e: Status 404 returned error can't find the container with id ed26b2c8eb23f9d8a6d0896fa3217868395bc48feb838346662f291ddd06120e Dec 06 09:08:52 crc kubenswrapper[4672]: W1206 09:08:52.263881 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5458b22_c606_4b4f_934e_ecb972895455.slice/crio-28193d30c0846f88cac2ebd2f88ae8d6720181f2b01b7d673d5e3dba92f47481 WatchSource:0}: Error finding container 28193d30c0846f88cac2ebd2f88ae8d6720181f2b01b7d673d5e3dba92f47481: Status 404 returned error can't find the container with id 28193d30c0846f88cac2ebd2f88ae8d6720181f2b01b7d673d5e3dba92f47481 Dec 06 09:08:52 crc kubenswrapper[4672]: W1206 09:08:52.289554 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475471e2_43d3_46f3_9aa1_b44f497b626f.slice/crio-77e3d58dc935385b419e79d990e3a09af339d5cc02472495150a4ccee6d6e131 WatchSource:0}: Error finding container 77e3d58dc935385b419e79d990e3a09af339d5cc02472495150a4ccee6d6e131: Status 404 returned error can't find the container with id 77e3d58dc935385b419e79d990e3a09af339d5cc02472495150a4ccee6d6e131 Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.297988 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.299663 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.799632237 +0000 UTC m=+150.543892524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.301928 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.302445 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.80242598 +0000 UTC m=+150.546686267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.314636 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bm7cx"] Dec 06 09:08:52 crc kubenswrapper[4672]: W1206 09:08:52.369118 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a761a8_3e6d_42eb_b0f8_db388dcf6952.slice/crio-81818567450fc7e4c6bde0f22e56da5186a9bba63c1fa437f01b1334ea4cd30a WatchSource:0}: Error finding container 81818567450fc7e4c6bde0f22e56da5186a9bba63c1fa437f01b1334ea4cd30a: Status 404 returned error can't find the container with id 81818567450fc7e4c6bde0f22e56da5186a9bba63c1fa437f01b1334ea4cd30a Dec 06 09:08:52 crc kubenswrapper[4672]: W1206 09:08:52.396801 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45b650d8_842d_4139_9858_94c3019f7be2.slice/crio-7843d15be5c06f961b83395d2c08ab0fcfb7ab33996cfc5666ddcdc5a886757f WatchSource:0}: Error finding container 7843d15be5c06f961b83395d2c08ab0fcfb7ab33996cfc5666ddcdc5a886757f: Status 404 returned error can't find the container with id 7843d15be5c06f961b83395d2c08ab0fcfb7ab33996cfc5666ddcdc5a886757f Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.404822 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.405568 4672 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:52.905540364 +0000 UTC m=+150.649800661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.436115 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh"] Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.507887 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.508237 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.008224186 +0000 UTC m=+150.752484473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.549985 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx"] Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.608950 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.609316 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.109296409 +0000 UTC m=+150.853556696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.714010 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.714433 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.214417373 +0000 UTC m=+150.958677660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.814932 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.815194 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.315162177 +0000 UTC m=+151.059422464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.815561 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.815955 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.31594057 +0000 UTC m=+151.060200857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.851765 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8c2f9389e54d20137e3a126e669d837763b3ab6a2a48cdbc64783b71cf4f293e"} Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.916444 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:52 crc kubenswrapper[4672]: E1206 09:08:52.916847 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.416825687 +0000 UTC m=+151.161085974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.954216 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" event={"ID":"285de502-3cec-4e87-b096-c9485f99ac4b","Type":"ContainerStarted","Data":"4bc0263d5928f151c44dcd2d3f4a7c673161e2ff6000a5997e7f9a2057756b4e"} Dec 06 09:08:52 crc kubenswrapper[4672]: W1206 09:08:52.954354 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0807b471_6f7b_4326_b33a_b4e274f94607.slice/crio-6578eac81cf21723285d225b5eda3c0eaeedfcd7f0cb46aa1b185ffb5246ed36 WatchSource:0}: Error finding container 6578eac81cf21723285d225b5eda3c0eaeedfcd7f0cb46aa1b185ffb5246ed36: Status 404 returned error can't find the container with id 6578eac81cf21723285d225b5eda3c0eaeedfcd7f0cb46aa1b185ffb5246ed36 Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.987755 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" event={"ID":"05e5af51-76dc-4825-bab8-a5048aea49e9","Type":"ContainerStarted","Data":"c51a4bd045d8936c25b4cdb38d689fd49fb82ef5f09a3502485ef0a3c991977c"} Dec 06 09:08:52 crc kubenswrapper[4672]: I1206 09:08:52.990135 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.012852 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" event={"ID":"b8a761a8-3e6d-42eb-b0f8-db388dcf6952","Type":"ContainerStarted","Data":"81818567450fc7e4c6bde0f22e56da5186a9bba63c1fa437f01b1334ea4cd30a"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.020134 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.020518 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.520500788 +0000 UTC m=+151.264761075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.023432 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.051633 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" event={"ID":"f28bb046-9dd7-47e0-a498-1928568abe59","Type":"ContainerStarted","Data":"198b589eb7aa83ba9ff6eb6c07c05c2b2987b5d98b27a19188acf1de7f8c5db7"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.078183 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x9m9h" event={"ID":"77d1174d-bfc2-4145-9bf2-c2b648f903e8","Type":"ContainerStarted","Data":"cdef4a60d085299da97edd591120a7c4ee8845ed9ccafdf9a47794d19e35e7f8"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.137071 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.138646 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.638614151 +0000 UTC m=+151.382874438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.159145 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" event={"ID":"a7a39312-812f-45a0-ab3f-362048a42c5f","Type":"ContainerStarted","Data":"c30cfd4ee5700aaff7926a95bb978e284473d3bec579a7523121598a99dcb9f6"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.166347 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.210289 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.220306 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f7b94be5ee3d28c711fa7d798052267efefce3ed977cdb3af2a358327ed9b3ea"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.222693 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.238668 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.240547 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.740524089 +0000 UTC m=+151.484784386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.258185 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" event={"ID":"637b32e8-5e9a-47ac-aeaf-60709cdfba63","Type":"ContainerStarted","Data":"931fd7037e721bc619c730bb47c933a8450fb5ebbf95326d622213fa09907f45"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.275450 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" event={"ID":"45b650d8-842d-4139-9858-94c3019f7be2","Type":"ContainerStarted","Data":"7843d15be5c06f961b83395d2c08ab0fcfb7ab33996cfc5666ddcdc5a886757f"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.283136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" event={"ID":"9824eace-01c1-49c3-9094-3f926eda9487","Type":"ContainerStarted","Data":"ed26b2c8eb23f9d8a6d0896fa3217868395bc48feb838346662f291ddd06120e"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.312046 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" event={"ID":"475471e2-43d3-46f3-9aa1-b44f497b626f","Type":"ContainerStarted","Data":"77e3d58dc935385b419e79d990e3a09af339d5cc02472495150a4ccee6d6e131"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.317924 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3033fd0d876b463cae304a81e4868a1ca583c39c4e7837cde28eda53940214af"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.348543 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" event={"ID":"21dbd9c3-6afd-44dd-aa63-c1094b853b5d","Type":"ContainerStarted","Data":"0680e63a87ec083a7d75312e290a1381fba19a463e11095f186fef50cf52e906"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.349847 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.350958 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.351138 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.851114887 +0000 UTC m=+151.595375174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.351339 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.354903 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.854857429 +0000 UTC m=+151.599117716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.405013 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x88bb" event={"ID":"64de6d79-f439-4a73-9ac6-605a71c8aab7","Type":"ContainerStarted","Data":"3c7d14b4030e24073ccb9a2dba7435077f45c8863cd2e04fdabd6a4d14f3f0f1"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.406249 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x88bb" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.421475 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-x88bb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.421567 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x88bb" podUID="64de6d79-f439-4a73-9ac6-605a71c8aab7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.436705 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dcdqg" event={"ID":"de34b8a9-076f-4aa5-acb7-52361b6deeb8","Type":"ContainerStarted","Data":"9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.443592 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 
09:08:53 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:53 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:53 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.443717 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.456024 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.456329 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" event={"ID":"211614db-3bf5-4db7-9146-cc91303fc217","Type":"ContainerStarted","Data":"1626201f60efd7891ab74d06d02ebc0c51b8a4baaf04879d9d02542e646e2d8c"} Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.473607 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.973508599 +0000 UTC m=+151.717768886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.473973 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.476324 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:53.976313783 +0000 UTC m=+151.720574070 (durationBeforeRetry 500ms). 
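The router's startup probe output above is the aggregated healthz format: one `[+]`/`[-]` line per named check, with any failing check turning the endpoint into a 500. A small illustrative renderer of that format (the check names mirror the log; this is not the router's actual implementation):

```go
// Render healthz-style aggregated check output, as seen in the router's
// startup probe body: [-]backend-http failed, [-]has-synced failed,
// [+]process-running ok, healthz check failed.
package main

import "fmt"

type check struct {
	name string
	ok   bool
}

func renderHealthz(checks []check) (status int, body string) {
	status = 200
	for _, c := range checks {
		if c.ok {
			body += fmt.Sprintf("[+]%s ok\n", c.name)
		} else {
			body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			status = 500 // any failed check fails the whole endpoint
		}
	}
	if status != 200 {
		body += "healthz check failed\n"
	}
	return status, body
}

func main() {
	status, body := renderHealthz([]check{
		{"backend-http", false},
		{"has-synced", false},
		{"process-running", true},
	})
	fmt.Printf("HTTP probe failed with statuscode: %d\n%s", status, body)
}
```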
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.586010 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.586381 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.086360694 +0000 UTC m=+151.830620981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.586698 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" event={"ID":"7116521c-c2a3-4c0e-bacf-9d83f4c59087","Type":"ContainerStarted","Data":"cc660e699adcd872c380c9c5f7978a6ac3380a793d55f0839ea90a867f281141"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.604221 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" event={"ID":"dda9e216-90cd-466b-a9bd-03fd4914b1b8","Type":"ContainerStarted","Data":"251212f6c1caf3093ae04cfceb6024feda63fdc893e171016a2d7c4ba7b71922"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.630681 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q66t7" event={"ID":"8ae93c21-14e1-4248-98cf-a250cc060f20","Type":"ContainerStarted","Data":"3ed0193eca59a2e3c7aa3e1bf763848c5c56b1235e46aa01f3d9ebc010f3a0d0"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.631197 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.638922 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" event={"ID":"87e773f5-6efb-4613-9af8-f05c7af849e1","Type":"ContainerStarted","Data":"51893beeaab20556d1667ec6f8327a3cfc665cad4b269abc5a8edf1523451d73"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.653631 4672 generic.go:334] "Generic (PLEG): container finished" podID="3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89" containerID="cccf92159b84af2f6f5ba712fba34e0da92062ec99efee8b7687ce93c0fbee5f" exitCode=0 Dec 06 09:08:53 crc 
kubenswrapper[4672]: I1206 09:08:53.653756 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" event={"ID":"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89","Type":"ContainerDied","Data":"cccf92159b84af2f6f5ba712fba34e0da92062ec99efee8b7687ce93c0fbee5f"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.662834 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g4dbv" event={"ID":"79f3925d-9a36-418b-bd25-50dd03106705","Type":"ContainerStarted","Data":"67ed982ec99e34cb2aa8354ef3314a9bb76679f0bc54c1cd3e919e2bd885216e"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.679780 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" event={"ID":"c5458b22-c606-4b4f-934e-ecb972895455","Type":"ContainerStarted","Data":"28193d30c0846f88cac2ebd2f88ae8d6720181f2b01b7d673d5e3dba92f47481"} Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.690824 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.692117 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.192102287 +0000 UTC m=+151.936362574 (durationBeforeRetry 500ms). 
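Each `SyncLoop (PLEG)` entry embeds a compact event payload (`ID`, `Type`, `Data`): the pod UID, the lifecycle transition, and the container or sandbox ID. A hypothetical decoder, assuming the `event={...}` payload has been extracted from a line as JSON (the sample value is the apiserver ContainerDied event above):

```go
// Decode the JSON payload carried by a PLEG event log entry.
package main

import (
	"encoding/json"
	"fmt"
)

type PodLifecycleEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	raw := `{"ID":"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89","Type":"ContainerDied","Data":"cccf92159b84af2f6f5ba712fba34e0da92062ec99efee8b7687ce93c0fbee5f"}`
	var ev PodLifecycleEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (container %s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
```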
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.776641 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-47kpj" podStartSLOduration=132.776617255 podStartE2EDuration="2m12.776617255s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:53.775020687 +0000 UTC m=+151.519280974" watchObservedRunningTime="2025-12-06 09:08:53.776617255 +0000 UTC m=+151.520877542" Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.795722 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dcdqg" podStartSLOduration=132.795696336 podStartE2EDuration="2m12.795696336s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:53.639258356 +0000 UTC m=+151.383518653" watchObservedRunningTime="2025-12-06 09:08:53.795696336 +0000 UTC m=+151.539956623" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.792077 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.292055986 +0000 UTC m=+152.036316273 (durationBeforeRetry 500ms). 
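The pod_startup_latency_tracker entries report podStartSLOduration as the gap between podCreationTimestamp and watchObservedRunningTime. Re-deriving the kube-apiserver-operator figure from the timestamps printed above:

```go
// Reproduce the startup-latency arithmetic from the log's own timestamps.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-06 09:06:41 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-06 09:08:53.776617255 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2m12.776617255s, i.e. podStartSLOduration=132.776617255.
	fmt.Println(observed.Sub(created))
}
```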
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.792001 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.837819 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.838316 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.33829934 +0000 UTC m=+152.082559627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:53 crc kubenswrapper[4672]: I1206 09:08:53.939806 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:53 crc kubenswrapper[4672]: E1206 09:08:53.940591 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.440564868 +0000 UTC m=+152.184825155 (durationBeforeRetry 500ms). 
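Each retry deadline carries both a wall-clock time and a monotonic offset (`m=+152.18...` is seconds since the kubelet process started). Subtracting the offset from the wall time recovers the process start, a quick sanity check when correlating entries; the values below are taken verbatim from a deadline above:

```go
// Recover the kubelet's approximate start time from one wall-clock/monotonic pair.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	wall, err := time.Parse(layout, "2025-12-06 09:08:54.440564868 +0000 UTC")
	if err != nil {
		panic(err)
	}
	offset := time.Duration(152.184825155 * float64(time.Second)) // m=+152.184825155
	fmt.Println("kubelet started around:", wall.Add(-offset).Format(time.RFC3339Nano))
}
```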
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.017829 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lds42" podStartSLOduration=133.017801839 podStartE2EDuration="2m13.017801839s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:53.964215966 +0000 UTC m=+151.708476253" watchObservedRunningTime="2025-12-06 09:08:54.017801839 +0000 UTC m=+151.762062126" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.044332 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhb2v" podStartSLOduration=133.044304641 podStartE2EDuration="2m13.044304641s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:54.00481983 +0000 UTC m=+151.749080147" watchObservedRunningTime="2025-12-06 09:08:54.044304641 +0000 UTC m=+151.788564928" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.073819 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.075506 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.089757 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.58971954 +0000 UTC m=+152.333979827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.180011 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.180729 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.680705911 +0000 UTC m=+152.424966198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.242489 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-x9m9h" podStartSLOduration=133.242465388 podStartE2EDuration="2m13.242465388s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:54.164936979 +0000 UTC m=+151.909197266" watchObservedRunningTime="2025-12-06 09:08:54.242465388 +0000 UTC m=+151.986725675" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.262293 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" podStartSLOduration=133.262269701 podStartE2EDuration="2m13.262269701s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:54.222550873 +0000 UTC m=+151.966811160" watchObservedRunningTime="2025-12-06 09:08:54.262269701 +0000 UTC m=+152.006529988" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.321566 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.329756 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x88bb" podStartSLOduration=133.329739049 podStartE2EDuration="2m13.329739049s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:54.327705799 +0000 
UTC m=+152.071966076" watchObservedRunningTime="2025-12-06 09:08:54.329739049 +0000 UTC m=+152.073999336" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.382275 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:08:54 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:54 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:54 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.392750 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.394826 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.395256 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.895235758 +0000 UTC m=+152.639496045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.396744 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.428399 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-q66t7" podStartSLOduration=134.428373789 podStartE2EDuration="2m14.428373789s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:54.409290709 +0000 UTC m=+152.153550996" watchObservedRunningTime="2025-12-06 09:08:54.428373789 +0000 UTC m=+152.172634076" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.433017 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.497278 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.497812 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:54.997795275 +0000 UTC m=+152.742055562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.600412 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.600811 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.100794587 +0000 UTC m=+152.845054874 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.633521 4672 patch_prober.go:28] interesting pod/console-operator-58897d9998-q66t7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.633573 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q66t7" podUID="8ae93c21-14e1-4248-98cf-a250cc060f20" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.651849 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rqdv8"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.701223 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.701703 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.201683294 +0000 UTC m=+152.945943581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.739267 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.792387 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" event={"ID":"459ffd9e-e358-41cb-b902-43e162b2c9d9","Type":"ContainerStarted","Data":"bd6babd99081929924e1cbf6db4ac7daeed6c41cd1c5d1b59d57073244c290b1"} Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.806026 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.806557 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.30653031 +0000 UTC m=+153.050790767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.853369 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" event={"ID":"0807b471-6f7b-4326-b33a-b4e274f94607","Type":"ContainerStarted","Data":"6578eac81cf21723285d225b5eda3c0eaeedfcd7f0cb46aa1b185ffb5246ed36"} Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.895844 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqnzx"] Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.908167 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:54 crc kubenswrapper[4672]: E1206 09:08:54.908591 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.408569432 +0000 UTC m=+153.152829719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.936007 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" event={"ID":"0e864b8d-8d39-4fb3-97ad-30963169ecde","Type":"ContainerStarted","Data":"7c21ed05add640f9624cdaefd5a4e12f15f9a2832ae5fb2e8c8036d703d8355b"} Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.972977 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2ed83fa9404f5b15f1ddea9adf54c6e5bf55b98eb65f0daff2773741d8de19f6"} Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.979547 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" event={"ID":"b2b8e997-a3d6-43cc-a637-11f8c0a710ec","Type":"ContainerStarted","Data":"0d24061010dfd9fc63f8b1ae91f1b4036fde234226f4ef7d24feb34cd25888b5"} Dec 06 09:08:54 crc kubenswrapper[4672]: I1206 09:08:54.981479 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" event={"ID":"0def7e4f-7c51-4814-9ceb-7ba90a4699ad","Type":"ContainerStarted","Data":"27042e2c73b42e857a0a55f25434f1086c78453172ebd02aeb9757ac62679805"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.004486 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-26xdk"] Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.019844 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.021883 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" event={"ID":"efef9ae7-ed6d-40fc-9b40-70b1d55383df","Type":"ContainerStarted","Data":"0b9af545de58dbe9ccbb8ee56da2778d5294f0d1391638b1d211eb7af7810da9"} Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.022207 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.52219099 +0000 UTC m=+153.266451267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.046221 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f6ba10bba24d6456d1e927479356398a968fc397a70b8c176385492204344c7f"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.047481 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4h2gq"] Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.049495 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" event={"ID":"211614db-3bf5-4db7-9146-cc91303fc217","Type":"ContainerStarted","Data":"5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.050478 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.051644 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" event={"ID":"acfb03ae-0ebb-47ec-8433-7de29e729cac","Type":"ContainerStarted","Data":"8bde53bfbb9a20ef5a36d07ac3dffe4624ff0991a30503ccf87b1d8112d341ca"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.088972 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" event={"ID":"7116521c-c2a3-4c0e-bacf-9d83f4c59087","Type":"ContainerStarted","Data":"a2542fed5c5d1aa6385b51307b1e85f0dfa9bc7a483f40130079c38526aadd9a"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.091485 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wt47x"] Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.125319 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.127736 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.627708617 +0000 UTC m=+153.371968904 (durationBeforeRetry 500ms). 
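The `SyncLoop UPDATE` for hostpath-provisioner/csi-hostpathplugin-wt47x above is the driver pod arriving; once its node service registers over the kubelet's plugin-registration socket, the mount retries can finally bind the PVC. A hypothetical node-local check for registration sockets (the directory is the kubelet's standard plugins_registry path; socket filenames vary by driver and are not shown in this log):

```go
// Look for CSI plugin registration sockets under the kubelet's watch directory.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// The kubelet's plugin watcher discovers drivers via sockets in this directory.
	matches, err := filepath.Glob("/var/lib/kubelet/plugins_registry/*.sock")
	if err != nil {
		panic(err)
	}
	for _, m := range matches {
		fi, err := os.Stat(m)
		if err == nil && fi.Mode()&os.ModeSocket != 0 {
			fmt.Println("registration socket:", m)
		}
	}
}
```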
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.136142 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" event={"ID":"dda9e216-90cd-466b-a9bd-03fd4914b1b8","Type":"ContainerStarted","Data":"8506d23e4e2fdcb742b2fc9b8f187e95c40fb27de59952f313d49107e989a2bb"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.172259 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r6dhb" podStartSLOduration=134.172234439 podStartE2EDuration="2m14.172234439s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:55.107106001 +0000 UTC m=+152.851366288" watchObservedRunningTime="2025-12-06 09:08:55.172234439 +0000 UTC m=+152.916494726" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.173855 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" event={"ID":"88233a33-81de-4a10-8e6b-bf8ae80beb22","Type":"ContainerStarted","Data":"a177b0e0bfaa84d15213858262b1358698711a6c47731a84dc19e7b8023fa76b"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.215830 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" event={"ID":"637b32e8-5e9a-47ac-aeaf-60709cdfba63","Type":"ContainerStarted","Data":"65545d3a7232c7ccbe447d2b12c0a21b686e094015e7782a45a522446be54546"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.180074 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g4dbv" podStartSLOduration=7.180039973 podStartE2EDuration="7.180039973s" podCreationTimestamp="2025-12-06 09:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:55.16857024 +0000 UTC m=+152.912830527" watchObservedRunningTime="2025-12-06 09:08:55.180039973 +0000 UTC m=+152.924300260" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.228033 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.256949 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.756921892 +0000 UTC m=+153.501182179 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.269494 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" event={"ID":"1babc617-f6e7-4ec3-a4a2-82cd7ca080fb","Type":"ContainerStarted","Data":"2ffcacffb9122cfedc5b9efce6231b415a2fe9ec6802562063b9ee666ea04d03"} Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.281743 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-x88bb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.288216 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x88bb" podUID="64de6d79-f439-4a73-9ac6-605a71c8aab7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.349077 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.354859 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.85481431 +0000 UTC m=+153.599074597 (durationBeforeRetry 500ms). 
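The downloads pod's readiness probe keeps failing with `connection refused` because nothing is listening on 10.217.0.9:8080 yet; the kubelet's HTTP prober treats status codes in the 2xx–3xx range as success and everything else, including dial errors, as failure. A minimal sketch of the same kind of check (the endpoint is taken from the log):

```go
// A simple HTTP GET readiness check with a short timeout, in the spirit of
// the kubelet prober entries above.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 10.217.0.9:8080: connect: connection refused"
	}
	defer resp.Body.Close()
	// Kubelet counts >=200 and <400 as success.
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://10.217.0.9:8080/"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
```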
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.373402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.380756 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:08:55 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:55 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:55 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.380826 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.382050 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.882030354 +0000 UTC m=+153.626290641 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.407551 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8phjh" podStartSLOduration=134.407526537 podStartE2EDuration="2m14.407526537s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:55.284196368 +0000 UTC m=+153.028456655" watchObservedRunningTime="2025-12-06 09:08:55.407526537 +0000 UTC m=+153.151786844" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.409238 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" podStartSLOduration=134.409232918 podStartE2EDuration="2m14.409232918s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:55.408383062 +0000 UTC m=+153.152643349" watchObservedRunningTime="2025-12-06 09:08:55.409232918 +0000 UTC m=+153.153493205" Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.412966 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9"] Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.419927 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f"] Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.474519 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.474900 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:55.974878891 +0000 UTC m=+153.719139168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.576502 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.576995 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.076979295 +0000 UTC m=+153.821239582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.624370 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7gxtl"] Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.677958 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.678293 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.178273315 +0000 UTC m=+153.922533602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.779078 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.779582 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.279566725 +0000 UTC m=+154.023827012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.880395 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.880732 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.38070992 +0000 UTC m=+154.124970207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.881112 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.881515 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.381507774 +0000 UTC m=+154.125768061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.981880 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.982130 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.482090362 +0000 UTC m=+154.226350649 (durationBeforeRetry 500ms). 
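By this point the TearDown/MountDevice pair has failed on every 500ms window for several seconds. A quick, illustrative way to quantify the churn from a saved copy of this journal (the file name is an assumption):

```go
// Count how often the CSI driver-not-registered error occurs in a log file.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("kubelet.log")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	const needle = "not found in the list of registered CSI drivers"
	count := 0
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		count += strings.Count(sc.Text(), needle)
	}
	fmt.Printf("%d occurrences of %q\n", count, needle)
}
```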
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:55 crc kubenswrapper[4672]: I1206 09:08:55.982662 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:55 crc kubenswrapper[4672]: E1206 09:08:55.983131 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.483122063 +0000 UTC m=+154.227382350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.052194 4672 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5pxtv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.052274 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" podUID="211614db-3bf5-4db7-9146-cc91303fc217" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.084252 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.084749 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.584726973 +0000 UTC m=+154.328987260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.186431 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.187375 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.687359502 +0000 UTC m=+154.431619789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.283951 4672 patch_prober.go:28] interesting pod/console-operator-58897d9998-q66t7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.284524 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q66t7" podUID="8ae93c21-14e1-4248-98cf-a250cc060f20" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.287414 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.293778 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.793720043 +0000 UTC m=+154.537980330 (durationBeforeRetry 500ms). 
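The readiness probe failures interleaved with the volume retries (route-controller-manager and console-operator above) are ordinary kubelet HTTP probes: "Client.Timeout exceeded while awaiting headers" is the standard net/http error when the probe's client timeout elapses before a response arrives, and "connect: connection refused" appears once the pod IP is routable but nothing is listening yet. A minimal probe-style check, assuming a hypothetical local endpoint in place of the pod IPs above:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce issues a single HTTP readiness check the way a prober would:
// any connect error, timeout, or non-2xx status counts as a failure.
func probeOnce(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "Client.Timeout exceeded while awaiting headers"
		// or "connect: connection refused", as in the log.
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Hypothetical endpoint standing in for https://10.217.0.23:8443/readyz.
	if err := probeOnce("http://127.0.0.1:1/healthz", time.Second); err != nil {
		fmt.Println("Probe failed:", err) // connection refused: nothing on port 1
	}
}
```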
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.294002 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.294501 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.794482717 +0000 UTC m=+154.538743004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.337560 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" event={"ID":"88233a33-81de-4a10-8e6b-bf8ae80beb22","Type":"ContainerStarted","Data":"4ad014051e1e14e647e5b161e3cbde6f2b1f2e8024e627201e33bf26873abcea"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.346872 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:08:56 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:56 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:56 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.347299 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.357132 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" event={"ID":"56a63b61-c938-4435-9112-a02277f6caa4","Type":"ContainerStarted","Data":"76dd66145fb1ffa2fb298aa9dd6ffbedac5b6512bc2e3f30b70ff02cb98c2fc9"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.359973 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" 
event={"ID":"8760ee59-3419-480a-a540-6640481b8e1e","Type":"ContainerStarted","Data":"075c722d0857b845b8ad36d0556c878855928f61077cc543e35e030606bda6ea"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.375924 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" event={"ID":"0807b471-6f7b-4326-b33a-b4e274f94607","Type":"ContainerStarted","Data":"731fc0a5fb404e753ba21108fbbf74af3f88c3149cafa27be58e0560e0c4b447"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.377687 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.390329 4672 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-znflx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.390701 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" podUID="0807b471-6f7b-4326-b33a-b4e274f94607" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.399962 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.400582 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:56.900539048 +0000 UTC m=+154.644799325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.402136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" event={"ID":"45b650d8-842d-4139-9858-94c3019f7be2","Type":"ContainerStarted","Data":"f183c702a08505050cfb999c0e77d32bc47266a5919a5c9cd0ac994f944f12d7"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.427022 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" event={"ID":"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89","Type":"ContainerStarted","Data":"b98ca545382bc4d0e04509c374d30e424d36a1bc2250888fb8e94514882710f9"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.437880 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" event={"ID":"285de502-3cec-4e87-b096-c9485f99ac4b","Type":"ContainerStarted","Data":"a55d90e6e9e58e769c1fe8a28d4771192b32914e5011a7b5d84f85a087b07288"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.467503 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" event={"ID":"9824eace-01c1-49c3-9094-3f926eda9487","Type":"ContainerStarted","Data":"babdf311fbd2afa731129b92e3c9f284f98f6718231f89ed521aeecf8a487ce1"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.494245 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" event={"ID":"c5458b22-c606-4b4f-934e-ecb972895455","Type":"ContainerStarted","Data":"d53263de9c49289637b124960e013a0bf5d036f0c39aa1ca19aff78606fb6e9c"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.507196 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.509432 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.009409485 +0000 UTC m=+154.753669772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.513966 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" event={"ID":"459ffd9e-e358-41cb-b902-43e162b2c9d9","Type":"ContainerStarted","Data":"aa5eef64b8a0e8410e5c55c4141d9f9dcf28d753e114000a494dc6bde75b808c"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.548421 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" podStartSLOduration=135.548401841 podStartE2EDuration="2m15.548401841s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:56.547065762 +0000 UTC m=+154.291326049" watchObservedRunningTime="2025-12-06 09:08:56.548401841 +0000 UTC m=+154.292662128" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.549699 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rcnv8" podStartSLOduration=136.54969072 podStartE2EDuration="2m16.54969072s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:56.451790712 +0000 UTC m=+154.196050989" watchObservedRunningTime="2025-12-06 09:08:56.54969072 +0000 UTC m=+154.293951007" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.608903 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.610477 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.110455717 +0000 UTC m=+154.854716004 (durationBeforeRetry 500ms). 
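The "Observed pod startup duration" lines are plain time arithmetic: podStartSLOduration is the observed running time minus podCreationTimestamp, and the 0001-01-01 00:00:00 values for firstStartedPulling/lastFinishedPulling are Go zero times, meaning no image pull was observed. Reproducing the catalog-operator figure from the values logged above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the catalog-operator-68c6474976-znflx entry above.
	// ".999999999" makes the fractional seconds optional when parsing.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-12-06 09:06:41 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-12-06 09:08:56.548401841 +0000 UTC")

	// Matches podStartE2EDuration="2m15.548401841s" and
	// podStartSLOduration=135.548401841 in the log.
	fmt.Println(observed.Sub(created))

	// firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" is just the
	// time.Time zero value: no image pull happened for this container.
	fmt.Println(time.Time{}.IsZero(), time.Time{})
}
```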
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.616842 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lprjd" podStartSLOduration=135.616820638 podStartE2EDuration="2m15.616820638s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:56.612237931 +0000 UTC m=+154.356498208" watchObservedRunningTime="2025-12-06 09:08:56.616820638 +0000 UTC m=+154.361080935" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.682825 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" event={"ID":"1babc617-f6e7-4ec3-a4a2-82cd7ca080fb","Type":"ContainerStarted","Data":"798c2061de37d84f0841d94e2c9c388f7745b028068003e5bd28664882b9a93b"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.711215 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-bm7cx" podStartSLOduration=134.71119054 podStartE2EDuration="2m14.71119054s" podCreationTimestamp="2025-12-06 09:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:56.708019755 +0000 UTC m=+154.452280052" watchObservedRunningTime="2025-12-06 09:08:56.71119054 +0000 UTC m=+154.455450827" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.716261 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.716733 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.216717225 +0000 UTC m=+154.960977512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.740767 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" event={"ID":"87e773f5-6efb-4613-9af8-f05c7af849e1","Type":"ContainerStarted","Data":"ebb01384923b60e4123ab6bc20781ea95b44f4f37659d7631de6c0fc6c28c041"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.805464 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-445qb" podStartSLOduration=135.805438929 podStartE2EDuration="2m15.805438929s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:56.803296515 +0000 UTC m=+154.547556802" watchObservedRunningTime="2025-12-06 09:08:56.805438929 +0000 UTC m=+154.549699216" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.817653 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.817824 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.317754367 +0000 UTC m=+155.062014654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.824185 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.825135 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.325113368 +0000 UTC m=+155.069373655 (durationBeforeRetry 500ms). 
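The strict alternation of reconciler_common.go:159 (unmount) and :218 (mount) entries looks like one reconciliation sweep per pass: the same PVC sits in the actual state for a pod that appears to be going away (UID 8f668bae…) and in the desired state for the replacement image-registry pod (f874c07b…), and both directions fail on the same unregistered driver. A toy desired-versus-actual sweep, with invented types, just to show the shape of that loop:

```go
package main

import "fmt"

type mountState struct {
	volume string
	pod    string
}

// reconcile performs one pass: unmount anything attached for pods no
// longer desired, then mount what the desired state still lacks.
func reconcile(desired, actual []mountState, op func(kind string, m mountState) error) {
	want := map[mountState]bool{}
	for _, d := range desired {
		want[d] = true
	}
	for _, a := range actual {
		if !want[a] {
			op("UnmountVolume", a) // reconciler_common.go:159 analogue
		}
	}
	have := map[mountState]bool{}
	for _, a := range actual {
		have[a] = true
	}
	for _, d := range desired {
		if !have[d] {
			op("MountVolume", d) // reconciler_common.go:218 analogue
		}
	}
}

func main() {
	pvc := "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	reconcile(
		[]mountState{{pvc, "f874c07b-7566-441d-9546-6c3f7b64de13"}}, // new registry pod
		[]mountState{{pvc, "8f668bae-612b-4b75-9490-919e737c6a3b"}}, // old registry pod
		func(kind string, m mountState) error {
			fmt.Printf("%s started for volume %q pod %q\n", kind, m.volume, m.pod)
			return fmt.Errorf("driver not registered")
		},
	)
}
```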
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.895781 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" event={"ID":"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe","Type":"ContainerStarted","Data":"9b0b85a30439922f0c7b1d34df7c8da11cab18082c28030bd073f9e39ae3f35d"} Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.903187 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qff28" podStartSLOduration=135.903163403 podStartE2EDuration="2m15.903163403s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:56.901263636 +0000 UTC m=+154.645523923" watchObservedRunningTime="2025-12-06 09:08:56.903163403 +0000 UTC m=+154.647423690" Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.926133 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:56 crc kubenswrapper[4672]: E1206 09:08:56.926775 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.426747988 +0000 UTC m=+155.171008275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:56 crc kubenswrapper[4672]: I1206 09:08:56.958434 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" event={"ID":"7116521c-c2a3-4c0e-bacf-9d83f4c59087","Type":"ContainerStarted","Data":"ca48ad7a40d9223e86b7410cbc9c4bbaeb3948e52529e8d6b85e6512c92284a1"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.027923 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.028381 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.528365678 +0000 UTC m=+155.272625965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.029441 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rqdv8" event={"ID":"4c007303-368b-4322-8e3c-7b89c9f29c9e","Type":"ContainerStarted","Data":"cea801ae87d2babc1fd813a55761f69341aa0b81b8cf37a7e1277a7996eea05c"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.069788 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" event={"ID":"2fe5591f-8503-4eea-9b4f-e85419856dd6","Type":"ContainerStarted","Data":"8a7c75dad771fe1f8931d0bdc4912d59a32d8096b9c6568cd08b3e17ce581d68"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.085845 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" event={"ID":"9a2d76b4-eb44-49ba-ad51-fbe3022af615","Type":"ContainerStarted","Data":"a0e39cdeddcd0b94dc8846f24fc78e4ca406fd63d6fef5eb085dae73130a5007"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.085963 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" event={"ID":"9a2d76b4-eb44-49ba-ad51-fbe3022af615","Type":"ContainerStarted","Data":"3e318eed237cf1fa1a0b20e4b46d665cf62f439717438819912afe3c56b7dcb7"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.100244 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" event={"ID":"1bc7e85b-9077-40c2-a33d-daa00d8b1d47","Type":"ContainerStarted","Data":"35131ac30f6f9d7805d93d15b607d0ae0c611d0a83738783a5e36046f1a23e64"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.114232 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-b8m6z" podStartSLOduration=136.114208265 podStartE2EDuration="2m16.114208265s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.103670239 +0000 UTC m=+154.847930526" watchObservedRunningTime="2025-12-06 09:08:57.114208265 +0000 UTC m=+154.858468552" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.129137 4672 generic.go:334] "Generic (PLEG): container finished" podID="637b32e8-5e9a-47ac-aeaf-60709cdfba63" containerID="65545d3a7232c7ccbe447d2b12c0a21b686e094015e7782a45a522446be54546" exitCode=0 Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.129264 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" event={"ID":"637b32e8-5e9a-47ac-aeaf-60709cdfba63","Type":"ContainerDied","Data":"65545d3a7232c7ccbe447d2b12c0a21b686e094015e7782a45a522446be54546"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.136339 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.138174 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.638147081 +0000 UTC m=+155.382407528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.187300 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d8c6e8204ccab1a62332095fce7a6518ff470cd10406274d8ee4733f6a97e8e8"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.187851 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjlvh" podStartSLOduration=136.187832587 podStartE2EDuration="2m16.187832587s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.186810797 +0000 UTC m=+154.931071084" watchObservedRunningTime="2025-12-06 09:08:57.187832587 +0000 UTC m=+154.932092874" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.239809 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.241352 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.741326048 +0000 UTC m=+155.485586335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.282157 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" event={"ID":"b8a761a8-3e6d-42eb-b0f8-db388dcf6952","Type":"ContainerStarted","Data":"713878a4e078961632df76317843927d6c71d5d3568ea257d4d5594048833128"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.312513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" event={"ID":"475471e2-43d3-46f3-9aa1-b44f497b626f","Type":"ContainerStarted","Data":"8285ba82616bc5a408fcdd0db295903de8fd7a1000a48df38a75158abcacfe2a"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.328956 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" event={"ID":"0eaff321-3498-4f22-9b82-89f459cb982c","Type":"ContainerStarted","Data":"89fd1b55f70e5c838b291caebb82e1d2ba3afeb1669ea2ecf05daf3fdac2b792"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.343240 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.343435 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.843402201 +0000 UTC m=+155.587662488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.344021 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.346075 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.84605272 +0000 UTC m=+155.590313007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.358304 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:08:57 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:57 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:57 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.358373 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.377757 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vrlvb" podStartSLOduration=136.377730027 podStartE2EDuration="2m16.377730027s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.273584662 +0000 UTC m=+155.017844949" watchObservedRunningTime="2025-12-06 09:08:57.377730027 +0000 UTC m=+155.121990314" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.392622 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g4dbv" event={"ID":"79f3925d-9a36-418b-bd25-50dd03106705","Type":"ContainerStarted","Data":"122c99023ef067551aaff4483cab95a3fbf7a4530d6bf128de656c955f0030de"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.407147 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" event={"ID":"3a64ce15-c29c-46af-91a1-309857493594","Type":"ContainerStarted","Data":"595fe28dacd1475f4bde5e1900dbf77456d3f17deafa78bfe8361c37f2ec67e0"} Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.437271 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" podStartSLOduration=136.437244808 podStartE2EDuration="2m16.437244808s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.378401998 +0000 UTC m=+155.122662285" watchObservedRunningTime="2025-12-06 09:08:57.437244808 +0000 UTC m=+155.181505095" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.448345 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:57 
crc kubenswrapper[4672]: I1206 09:08:57.449777 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7gxtl" event={"ID":"aecee641-b961-499e-9387-0a5d008a205c","Type":"ContainerStarted","Data":"ad27f1f2e0427fba8c59277a5eadcd431beb7f546caf28c5f70d2c7f37544671"} Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.450289 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:57.950265937 +0000 UTC m=+155.694526224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.471954 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-x88bb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.472033 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x88bb" podUID="64de6d79-f439-4a73-9ac6-605a71c8aab7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.481195 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.552681 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.559526 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.059505644 +0000 UTC m=+155.803765931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.642320 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4l2wz" podStartSLOduration=136.642291031 podStartE2EDuration="2m16.642291031s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.641818097 +0000 UTC m=+155.386078384" watchObservedRunningTime="2025-12-06 09:08:57.642291031 +0000 UTC m=+155.386551318" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.656894 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.657327 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.15730173 +0000 UTC m=+155.901562017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.705418 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" podStartSLOduration=136.705389808 podStartE2EDuration="2m16.705389808s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.702689807 +0000 UTC m=+155.446950094" watchObservedRunningTime="2025-12-06 09:08:57.705389808 +0000 UTC m=+155.449650095" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.772620 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.773103 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.273083572 +0000 UTC m=+156.017343859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.790232 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" podStartSLOduration=137.790215166 podStartE2EDuration="2m17.790215166s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.78837646 +0000 UTC m=+155.532636747" watchObservedRunningTime="2025-12-06 09:08:57.790215166 +0000 UTC m=+155.534475453" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.878673 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.879295 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.379266998 +0000 UTC m=+156.123527285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.963320 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7gxtl" podStartSLOduration=9.963284082 podStartE2EDuration="9.963284082s" podCreationTimestamp="2025-12-06 09:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:57.953424587 +0000 UTC m=+155.697684874" watchObservedRunningTime="2025-12-06 09:08:57.963284082 +0000 UTC m=+155.707544369" Dec 06 09:08:57 crc kubenswrapper[4672]: I1206 09:08:57.981330 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:57 crc kubenswrapper[4672]: E1206 09:08:57.981912 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.481859168 +0000 UTC m=+156.226119455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.082658 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.083134 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.583115706 +0000 UTC m=+156.327375993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.184509 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.185217 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.68520148 +0000 UTC m=+156.429461767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.295717 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.296533 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.796512769 +0000 UTC m=+156.540773056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.344504 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:08:58 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:58 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:58 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.344573 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.398381 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.398724 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.898712416 +0000 UTC m=+156.642972703 (durationBeforeRetry 500ms). 
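The router's startup probe output is the standard healthz multiplexer format: one [+]/[-] line per named check, reasons withheld unless the request is verbose, and HTTP 500 when any check fails, which is exactly the backend-http/has-synced/process-running breakdown above. A minimal rendering of that format (modeled on the k8s.io/apiserver healthz style, not the router's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

type check struct {
	name string
	ok   bool
}

// renderHealthz formats checks the way the probe output above shows:
// failing checks marked [-] with reasons withheld, 500 when any fail.
func renderHealthz(checks []check) (status int, body string) {
	var b strings.Builder
	status = 200
	for _, c := range checks {
		if c.ok {
			fmt.Fprintf(&b, "[+]%s ok\n", c.name)
		} else {
			fmt.Fprintf(&b, "[-]%s failed: reason withheld\n", c.name)
			status = 500
		}
	}
	if status != 200 {
		b.WriteString("healthz check failed\n")
	}
	return status, b.String()
}

func main() {
	status, body := renderHealthz([]check{
		{"backend-http", false},
		{"has-synced", false},
		{"process-running", true},
	})
	fmt.Printf("HTTP %d\n%s", status, body) // matches the router probe output above
}
```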
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.467468 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" event={"ID":"637b32e8-5e9a-47ac-aeaf-60709cdfba63","Type":"ContainerStarted","Data":"568f1c9147524bd0eeb23c48c10220a94c7f59a6e5c06bb1a9e648b544d810a9"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.468100 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.480341 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" event={"ID":"9824eace-01c1-49c3-9094-3f926eda9487","Type":"ContainerStarted","Data":"f087278d00db2abd805331eb5650ca961b87534e89d76e6f78d379f9243c0815"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.487299 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" event={"ID":"56a63b61-c938-4435-9112-a02277f6caa4","Type":"ContainerStarted","Data":"d7f5beda68662c559815aaf788d39b9bde7babcf4d747480424d7d82ec48fe14"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.491917 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5ss48" event={"ID":"b2b8e997-a3d6-43cc-a637-11f8c0a710ec","Type":"ContainerStarted","Data":"cb2d1d836191b9b9f47ab2ee1bcc7a9b320a57c96795237608849cdf8405c9f6"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.499471 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.499663 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:58.999637275 +0000 UTC m=+156.743897562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.500165 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.500517 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.000507171 +0000 UTC m=+156.744767458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.507368 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" event={"ID":"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe","Type":"ContainerStarted","Data":"deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.507610 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.509394 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xqnzx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.509444 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.511423 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rqdv8" event={"ID":"4c007303-368b-4322-8e3c-7b89c9f29c9e","Type":"ContainerStarted","Data":"dcddb94bcfa7caa24185499497b518aadc5f6b208962f36d58bfd1e516f2e1e9"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.511458 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rqdv8" 
event={"ID":"4c007303-368b-4322-8e3c-7b89c9f29c9e","Type":"ContainerStarted","Data":"1901406410a95574f95514c877eb639271a14541ca51164217a2486a2605ba56"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.511866 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rqdv8" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.516011 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" event={"ID":"0eaff321-3498-4f22-9b82-89f459cb982c","Type":"ContainerStarted","Data":"e9f908d0dfd76775dc4eee882ff28ea40ed5d24e0d71a0ab540711dde58e30e2"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.522106 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" event={"ID":"acfb03ae-0ebb-47ec-8433-7de29e729cac","Type":"ContainerStarted","Data":"a19a7d0a81520f4f141f52ccc83b73ad6e5900c56f82a456569897cd851c49f4"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.528835 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" event={"ID":"8760ee59-3419-480a-a540-6640481b8e1e","Type":"ContainerStarted","Data":"cbaa7b45c7c174b10bfa4d52cc9db37e371e49d449e6a70c226365c11d7dafdd"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.528880 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" event={"ID":"8760ee59-3419-480a-a540-6640481b8e1e","Type":"ContainerStarted","Data":"4527a74ca1dd8608ea4608e8caabc2ca3b293c83180fc5a809d60dd60e06b73e"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.537722 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" event={"ID":"0e864b8d-8d39-4fb3-97ad-30963169ecde","Type":"ContainerStarted","Data":"710100019cd51b15f5f79c02442c5c3f40b87b7501e52c3b74e7332f0be3b0e8"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.538203 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" event={"ID":"0e864b8d-8d39-4fb3-97ad-30963169ecde","Type":"ContainerStarted","Data":"034b4934bb6ead53690a72a808f27e747ff47a617034d9060607fcf5a520d036"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.538337 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.548694 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" event={"ID":"efef9ae7-ed6d-40fc-9b40-70b1d55383df","Type":"ContainerStarted","Data":"42ee7d55da1017896735e03c58a28fb3901f9041f501cd08e41dcb4af56ad4ee"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.549256 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.550807 4672 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pzl96 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Dec 06 09:08:58 crc 
kubenswrapper[4672]: I1206 09:08:58.550966 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" podUID="efef9ae7-ed6d-40fc-9b40-70b1d55383df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.570397 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7gxtl" event={"ID":"aecee641-b961-499e-9387-0a5d008a205c","Type":"ContainerStarted","Data":"27613795cca7177d7e5f9cdd90ac1d18b936495a61ddd2e3501075f7cd610d8d"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.572080 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" event={"ID":"1bc7e85b-9077-40c2-a33d-daa00d8b1d47","Type":"ContainerStarted","Data":"72aea703444c701ec2f9e8acbdcb8eca7403e7e28e68f484c6f27ce600bb8b09"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.587219 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" podStartSLOduration=138.587193294 podStartE2EDuration="2m18.587193294s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:58.585372429 +0000 UTC m=+156.329632726" watchObservedRunningTime="2025-12-06 09:08:58.587193294 +0000 UTC m=+156.331453581" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.587759 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" event={"ID":"3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89","Type":"ContainerStarted","Data":"44d30a900c689bdad76550582931614d026c9df328f8ec3d0bdad2c7cd7abcb7"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.601093 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" event={"ID":"459ffd9e-e358-41cb-b902-43e162b2c9d9","Type":"ContainerStarted","Data":"1ceb4601cefa7ff0bc45cd49e59359bcc606fcc3cbf18d8f93ee01253ada4d34"} Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.602250 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.605429 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.105409008 +0000 UTC m=+156.849669295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.617758 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-znflx" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.705776 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.714262 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.214243633 +0000 UTC m=+156.958503920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.753298 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" podStartSLOduration=137.753263431 podStartE2EDuration="2m17.753263431s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:58.751015763 +0000 UTC m=+156.495276080" watchObservedRunningTime="2025-12-06 09:08:58.753263431 +0000 UTC m=+156.497523728" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.754383 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q8xwg" podStartSLOduration=137.754376494 podStartE2EDuration="2m17.754376494s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:58.660194387 +0000 UTC m=+156.404454674" watchObservedRunningTime="2025-12-06 09:08:58.754376494 +0000 UTC m=+156.498636771" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.806969 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:58 crc 
kubenswrapper[4672]: E1206 09:08:58.807395 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.3073741 +0000 UTC m=+157.051634377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.826436 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" podStartSLOduration=137.826417149 podStartE2EDuration="2m17.826417149s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:58.824918564 +0000 UTC m=+156.569178851" watchObservedRunningTime="2025-12-06 09:08:58.826417149 +0000 UTC m=+156.570677436" Dec 06 09:08:58 crc kubenswrapper[4672]: I1206 09:08:58.908769 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:58 crc kubenswrapper[4672]: E1206 09:08:58.909250 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.409218176 +0000 UTC m=+157.153478453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.010650 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.011016 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.51099539 +0000 UTC m=+157.255255677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.071307 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zl58f" podStartSLOduration=138.071272883 podStartE2EDuration="2m18.071272883s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:58.888271229 +0000 UTC m=+156.632531536" watchObservedRunningTime="2025-12-06 09:08:59.071272883 +0000 UTC m=+156.815533170" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.072842 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-24dsx" podStartSLOduration=139.072835749 podStartE2EDuration="2m19.072835749s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.07019045 +0000 UTC m=+156.814450737" watchObservedRunningTime="2025-12-06 09:08:59.072835749 +0000 UTC m=+156.817096036" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.112744 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.113292 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.613266479 +0000 UTC m=+157.357526766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.214350 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.214841 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.714815866 +0000 UTC m=+157.459076153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.316267 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.316759 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.816743125 +0000 UTC m=+157.561003412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.350786 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:08:59 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:08:59 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:08:59 crc kubenswrapper[4672]: healthz check failed Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.350859 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.418644 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.418923 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.91885709 +0000 UTC m=+157.663117377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.419343 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.419779 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:08:59.919769436 +0000 UTC m=+157.664029723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.520945 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.521372 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.021335394 +0000 UTC m=+157.765595681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.607147 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" event={"ID":"3a64ce15-c29c-46af-91a1-309857493594","Type":"ContainerStarted","Data":"a5604c9598fa68d83cac857f7fef0c5f4da096c9ee7afdae2998320eb87ddb80"} Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.611166 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" event={"ID":"1bc7e85b-9077-40c2-a33d-daa00d8b1d47","Type":"ContainerStarted","Data":"0d70b6eb046d20dce3494715a8929db2626dc7ed84a6c77b1efcfeecef30c92e"} Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.613060 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xqnzx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.613132 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.622787 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.623174 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.123160181 +0000 UTC m=+157.867420458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.633465 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rqdv8" podStartSLOduration=11.633440777 podStartE2EDuration="11.633440777s" podCreationTimestamp="2025-12-06 09:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.631517541 +0000 UTC m=+157.375777828" watchObservedRunningTime="2025-12-06 09:08:59.633440777 +0000 UTC m=+157.377701064" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.634853 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4r8t9" podStartSLOduration=138.63484667 podStartE2EDuration="2m18.63484667s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.421488628 +0000 UTC m=+157.165748915" watchObservedRunningTime="2025-12-06 09:08:59.63484667 +0000 UTC m=+157.379106957" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.710169 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" podStartSLOduration=138.710139262 podStartE2EDuration="2m18.710139262s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.705172113 +0000 UTC m=+157.449432400" watchObservedRunningTime="2025-12-06 09:08:59.710139262 +0000 UTC m=+157.454399549" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.723976 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.223943245 +0000 UTC m=+157.968203532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.723834 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.725906 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.726437 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.226412709 +0000 UTC m=+157.970672986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.828712 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.828937 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.328895594 +0000 UTC m=+158.073155881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.829005 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.829371 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.329355237 +0000 UTC m=+158.073615524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.832273 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-26xdk" podStartSLOduration=138.832259044 podStartE2EDuration="2m18.832259044s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.831661037 +0000 UTC m=+157.575921324" watchObservedRunningTime="2025-12-06 09:08:59.832259044 +0000 UTC m=+157.576519331" Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.929872 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:08:59 crc kubenswrapper[4672]: E1206 09:08:59.930447 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.4304201 +0000 UTC m=+158.174680387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:08:59 crc kubenswrapper[4672]: I1206 09:08:59.963885 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dvs5c" podStartSLOduration=138.9638577 podStartE2EDuration="2m18.9638577s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.893926619 +0000 UTC m=+157.638186916" watchObservedRunningTime="2025-12-06 09:08:59.9638577 +0000 UTC m=+157.708117987" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.021814 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.021892 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.031931 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.032358 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.532341849 +0000 UTC m=+158.276602136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.041783 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.076056 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.076113 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.078280 4672 patch_prober.go:28] interesting pod/console-f9d7485db-dcdqg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.078371 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dcdqg" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.116583 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" podStartSLOduration=139.116556748 podStartE2EDuration="2m19.116556748s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:00.101708995 +0000 UTC m=+157.845969282" watchObservedRunningTime="2025-12-06 09:09:00.116556748 +0000 UTC m=+157.860817035" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.117152 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4h2gq" podStartSLOduration=139.117147916 podStartE2EDuration="2m19.117147916s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:59.967120799 +0000 UTC m=+157.711381106" watchObservedRunningTime="2025-12-06 09:09:00.117147916 +0000 UTC m=+157.861408203" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.133365 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.133683 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-06 09:09:00.633629899 +0000 UTC m=+158.377890186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.134006 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.134453 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.634433653 +0000 UTC m=+158.378693940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.146469 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.146523 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.235490 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.235913 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.735893418 +0000 UTC m=+158.480153705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.270781 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-x88bb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.271160 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x88bb" podUID="64de6d79-f439-4a73-9ac6-605a71c8aab7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.271327 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-x88bb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.271414 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x88bb" podUID="64de6d79-f439-4a73-9ac6-605a71c8aab7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.339665 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.343526 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.343987 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.84397213 +0000 UTC m=+158.588232417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.353546 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:00 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:00 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:00 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.353646 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.360030 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-q66t7" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.448379 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.448832 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.948788016 +0000 UTC m=+158.693048303 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.449139 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.449546 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:00.949539308 +0000 UTC m=+158.693799595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.523587 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bpn8p"] Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.525153 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:09:00 crc kubenswrapper[4672]: W1206 09:09:00.538912 4672 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.538969 4672 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.551330 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.551473 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.051447456 +0000 UTC m=+158.795707743 (durationBeforeRetry 500ms). 
Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.551473 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.051447456 +0000 UTC m=+158.795707743 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.551731 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.552089 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.052081325 +0000 UTC m=+158.796341612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.587116 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bpn8p"]
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.615690 4672 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pzl96 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.615765 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" podUID="efef9ae7-ed6d-40fc-9b40-70b1d55383df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.637523 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" event={"ID":"3a64ce15-c29c-46af-91a1-309857493594","Type":"ContainerStarted","Data":"13ae96d3e3b2ae46456674c1a9bcd71e93237a15be3c98e5aa3c30118f27552d"}
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.653289 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.653659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-catalog-content\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.653699 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nlb2\" (UniqueName: \"kubernetes.io/projected/9fef1798-dde5-4ef8-a4fa-5a5997738964-kube-api-access-5nlb2\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.653730 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-utilities\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p"
Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.653844 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.153823139 +0000 UTC m=+158.898083416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.676942 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5fpzc"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.694340 4672 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lcghp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]log ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]etcd ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/max-in-flight-filter failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/image.openshift.io-apiserver-caches failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]poststarthook/project.openshift.io-projectcache ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [-]poststarthook/openshift.io-restmapperupdater failed: reason withheld
Dec 06 09:09:00 crc kubenswrapper[4672]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 06 09:09:00 crc kubenswrapper[4672]: livez check failed
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.694459 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" podUID="3da7f241-4fe4-4c1b-bfa8-b273bd6e7a89" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.728064 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vtk7z"]
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.729062 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vtk7z"
Dec 06 09:09:00 crc kubenswrapper[4672]: W1206 09:09:00.737734 4672 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object
Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.737797 4672 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.754899 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nlb2\" (UniqueName: \"kubernetes.io/projected/9fef1798-dde5-4ef8-a4fa-5a5997738964-kube-api-access-5nlb2\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.756395 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-utilities\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p"
Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.756874 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-utilities\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p"
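The [+]/[-] lines above are the standard output of an aggregated healthz/livez endpoint: the apiserver runs each named check, prints ok or failed (withholding the reason from unauthenticated callers), and returns HTTP 500 if any check fails, which the kubelet then records as a startup-probe failure. A small Go sketch of an endpoint producing this output shape, with invented check names (not the apiserver's implementation):

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    type check struct {
        name string
        fn   func() error
    }

    // healthz aggregates named checks into the [+]/[-] body format seen above.
    func healthz(checks []check) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            body, failed := "", false
            for _, c := range checks {
                if err := c.fn(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name) // reason hidden from callers
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                w.WriteHeader(http.StatusInternalServerError) // kubelet logs "statuscode: 500"
                body += "healthz check failed\n"
            }
            fmt.Fprint(w, body)
        }
    }

    func main() {
        checks := []check{ // check names are invented for the sketch
            {"process-running", func() error { return nil }},
            {"backend-http", func() error { return fmt.Errorf("backend not synced") }},
        }
        http.HandleFunc("/healthz", healthz(checks))
        log.Fatal(http.ListenAndServe(":8080", nil))
    }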
\"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.757646 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.257632124 +0000 UTC m=+159.001892411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.760258 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-catalog-content\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.760680 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-catalog-content\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.850719 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nlb2\" (UniqueName: \"kubernetes.io/projected/9fef1798-dde5-4ef8-a4fa-5a5997738964-kube-api-access-5nlb2\") pod \"certified-operators-bpn8p\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.862211 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.862805 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlpl2\" (UniqueName: \"kubernetes.io/projected/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-kube-api-access-dlpl2\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.862986 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-utilities\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.863186 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-catalog-content\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.863409 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.363381227 +0000 UTC m=+159.107641524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.931908 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zr2vx"] Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.933229 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.964697 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-catalog-content\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.965618 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlpl2\" (UniqueName: \"kubernetes.io/projected/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-kube-api-access-dlpl2\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.966239 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.965547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-catalog-content\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: E1206 09:09:00.966709 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.466692176 +0000 UTC m=+159.210952463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.967285 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-utilities\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.967707 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-utilities\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:00 crc kubenswrapper[4672]: I1206 09:09:00.977871 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vtk7z"] Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.001111 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlpl2\" (UniqueName: \"kubernetes.io/projected/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-kube-api-access-dlpl2\") pod \"community-operators-vtk7z\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.035794 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zr2vx"] Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.069330 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.070042 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-utilities\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.070218 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w56l\" (UniqueName: \"kubernetes.io/projected/4fe165df-a354-4ea2-a51b-dabdaeb654f2-kube-api-access-2w56l\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.070384 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-catalog-content\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " 
pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.070715 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.570690218 +0000 UTC m=+159.314950515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.115994 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69p9s"] Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.118143 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.190096 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69p9s"] Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.208699 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-catalog-content\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.209223 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-utilities\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.209380 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.209486 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w56l\" (UniqueName: \"kubernetes.io/projected/4fe165df-a354-4ea2-a51b-dabdaeb654f2-kube-api-access-2w56l\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.210818 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-catalog-content\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.211169 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-utilities\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.211444 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.711426217 +0000 UTC m=+159.455686504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.255549 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.257873 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.264097 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.264585 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.291732 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.311031 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.311528 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx465\" (UniqueName: \"kubernetes.io/projected/de182d83-c1ee-4b2a-827b-90fbf7d1e626-kube-api-access-zx465\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.311692 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-utilities\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.311839 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-catalog-content\") pod 
\"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.312197 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:01.812082128 +0000 UTC m=+159.556342415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.313399 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w56l\" (UniqueName: \"kubernetes.io/projected/4fe165df-a354-4ea2-a51b-dabdaeb654f2-kube-api-access-2w56l\") pod \"certified-operators-zr2vx\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.330268 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pzl96" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.352690 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:01 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:01 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:01 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.353300 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.413564 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e09a9ce1-4aea-4258-9886-fa3312588ab8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.414081 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.414629 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 09:09:01.914589914 +0000 UTC m=+159.658850201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.414999 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-utilities\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.415667 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e09a9ce1-4aea-4258-9886-fa3312588ab8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.415866 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-catalog-content\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.416259 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx465\" (UniqueName: \"kubernetes.io/projected/de182d83-c1ee-4b2a-827b-90fbf7d1e626-kube-api-access-zx465\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.416195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-catalog-content\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.415637 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-utilities\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.420465 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xqnzx container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.420724 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.421232 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xqnzx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.421382 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.450029 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx465\" (UniqueName: \"kubernetes.io/projected/de182d83-c1ee-4b2a-827b-90fbf7d1e626-kube-api-access-zx465\") pod \"community-operators-69p9s\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.477979 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.480625 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.518143 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.518831 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e09a9ce1-4aea-4258-9886-fa3312588ab8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.519053 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e09a9ce1-4aea-4258-9886-fa3312588ab8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.519727 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.019707248 +0000 UTC m=+159.763967535 (durationBeforeRetry 500ms). 
Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.519727 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.019707248 +0000 UTC m=+159.763967535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.519879 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e09a9ce1-4aea-4258-9886-fa3312588ab8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.552914 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zr2vx"
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.557365 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e09a9ce1-4aea-4258-9886-fa3312588ab8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.590059 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.621094 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.621841 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.121826252 +0000 UTC m=+159.866086539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.722363 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.723185 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.223162903 +0000 UTC m=+159.967423180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.824762 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.825136 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.325120883 +0000 UTC m=+160.069381170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:01 crc kubenswrapper[4672]: I1206 09:09:01.930952 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:09:01 crc kubenswrapper[4672]: E1206 09:09:01.931485 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.431462484 +0000 UTC m=+160.175722771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.037818 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.038960 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.538938899 +0000 UTC m=+160.283199226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.047739 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.047945 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69p9s"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.063741 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vtk7z"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.141302 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.141689 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.641672742 +0000 UTC m=+160.385933029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.243110 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.243793 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.743777595 +0000 UTC m=+160.488037882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.326141 4672 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.352326 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
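The plugin_watcher entry above is the turning point of this section: the kubelet has noticed the driver's registration socket appear under /var/lib/kubelet/plugins_registry, which is exactly what the repeated "driver name ... not found in the list of registered CSI drivers" errors were waiting on. A stdlib-only Go sketch of that discovery step by polling (the real plugin watcher uses filesystem notifications plus a gRPC registration handshake; this shows only the discovery shape, with the directory path taken from the log):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
        "time"
    )

    func main() {
        dir := "/var/lib/kubelet/plugins_registry" // registry path from the log
        seen := map[string]bool{}
        for {
            entries, err := os.ReadDir(dir)
            if err != nil {
                fmt.Fprintln(os.Stderr, err)
                return
            }
            for _, e := range entries {
                name := e.Name()
                if seen[name] || !strings.HasSuffix(name, ".sock") {
                    continue
                }
                seen[name] = true
                fmt.Printf("Adding socket path or updating timestamp to desired state cache path=%q\n",
                    filepath.Join(dir, name))
                // A real watcher would now dial this socket and run the plugin
                // registration gRPC handshake before using the driver.
            }
            time.Sleep(time.Second)
        }
    }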
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.352856 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.852835748 +0000 UTC m=+160.597096035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.395831 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 09:09:02 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Dec 06 09:09:02 crc kubenswrapper[4672]: [+]process-running ok
Dec 06 09:09:02 crc kubenswrapper[4672]: healthz check failed
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.396275 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.458595 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.458961 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:02.958947912 +0000 UTC m=+160.703208199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.484923 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bpn8p"]
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.510962 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.568421 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.570492 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 09:09:03.070461067 +0000 UTC m=+160.814721354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.685817 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:02 crc kubenswrapper[4672]: E1206 09:09:02.686978 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 09:09:03.186961501 +0000 UTC m=+160.931221788 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbpp6" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.694157 4672 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-06T09:09:02.326550982Z","Handler":null,"Name":""}
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.723535 4672 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.723575 4672 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.747682 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nxl9d"]
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.749007 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.753811 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerStarted","Data":"1b904ee9d69070c8143dee3bb0b495d315ab3654c8ddad31e7fdaa76b55872c1"}
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.756984 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.787763 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.791854 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" event={"ID":"3a64ce15-c29c-46af-91a1-309857493594","Type":"ContainerStarted","Data":"8d32f044715715e3675612af37d069da5e79a63fef3c85364d1d7f1b5366da38"}
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.845674 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.893898 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-utilities\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.893987 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2s5f\" (UniqueName: \"kubernetes.io/projected/d0417e2e-2041-42b0-a404-236595aa99bd-kube-api-access-t2s5f\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.894037 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.894074 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-catalog-content\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:02 crc kubenswrapper[4672]: I1206 09:09:02.941641 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxl9d"]
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.004629 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-utilities\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.004689 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2s5f\" (UniqueName: \"kubernetes.io/projected/d0417e2e-2041-42b0-a404-236595aa99bd-kube-api-access-t2s5f\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.004758 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-catalog-content\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.005363 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-catalog-content\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.005585 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-utilities\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.071764 4672 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.072089 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.081199 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2s5f\" (UniqueName: \"kubernetes.io/projected/d0417e2e-2041-42b0-a404-236595aa99bd-kube-api-access-t2s5f\") pod \"redhat-marketplace-nxl9d\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " pod="openshift-marketplace/redhat-marketplace-nxl9d"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.136721 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hqx6n"]
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.138662 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqx6n"
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.149790 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqx6n"]
Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.165555 4672 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.195033 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zr2vx"] Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.216018 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmnd2\" (UniqueName: \"kubernetes.io/projected/f512bc0f-8532-4696-aac9-746557867772-kube-api-access-hmnd2\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.216102 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-catalog-content\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.216206 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-utilities\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.317942 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-catalog-content\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.319347 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.319376 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-utilities\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.319452 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmnd2\" (UniqueName: \"kubernetes.io/projected/f512bc0f-8532-4696-aac9-746557867772-kube-api-access-hmnd2\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.319801 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-catalog-content\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.325527 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fca5f829-3091-4191-abf5-2bece3ab91f7-metrics-certs\") pod \"network-metrics-daemon-w587t\" (UID: \"fca5f829-3091-4191-abf5-2bece3ab91f7\") " pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.355880 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:03 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:03 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:03 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.355967 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.356824 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-utilities\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.359448 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmnd2\" (UniqueName: \"kubernetes.io/projected/f512bc0f-8532-4696-aac9-746557867772-kube-api-access-hmnd2\") pod \"redhat-marketplace-hqx6n\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.499795 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.567949 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w587t" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.622563 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.759103 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w7htz"] Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.760549 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.801746 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.822719 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbpp6\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.845132 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-utilities\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.846434 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjxt\" (UniqueName: \"kubernetes.io/projected/ade26230-5c3c-4b75-bef5-9383cab17974-kube-api-access-2bjxt\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.847535 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-catalog-content\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.868615 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7htz"] Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.889912 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerStarted","Data":"8827559ef78bbc4a7dde429eab3d2909ba3a2d8890cd228143be2063a6bc1798"} Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.890922 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.897012 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerStarted","Data":"82dcf6166f8225d14c8fbfa6628b8cee496f9ca4766716ed9e9e6fe50b818bb0"} Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.910773 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e09a9ce1-4aea-4258-9886-fa3312588ab8","Type":"ContainerStarted","Data":"a97e0c8137d52094bcdd918bc5540218a31a958c4ae99e42159e3c230daa155f"} Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.914794 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vtk7z"] Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.948924 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-utilities\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.948981 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjxt\" (UniqueName: \"kubernetes.io/projected/ade26230-5c3c-4b75-bef5-9383cab17974-kube-api-access-2bjxt\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.949026 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-catalog-content\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.949905 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-utilities\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:03 crc kubenswrapper[4672]: I1206 09:09:03.950218 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-catalog-content\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.012452 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjxt\" (UniqueName: \"kubernetes.io/projected/ade26230-5c3c-4b75-bef5-9383cab17974-kube-api-access-2bjxt\") pod \"redhat-operators-w7htz\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.063076 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxl9d"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.100748 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.106150 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jdkg9"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.107470 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.132648 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jdkg9"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.191114 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69p9s"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.256985 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-utilities\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.257040 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stsvh\" (UniqueName: \"kubernetes.io/projected/52c1e804-08d7-433b-b991-438c08d1bb62-kube-api-access-stsvh\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.257095 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-catalog-content\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.344430 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:04 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:04 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:04 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.344708 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.362531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-utilities\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.362613 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stsvh\" (UniqueName: \"kubernetes.io/projected/52c1e804-08d7-433b-b991-438c08d1bb62-kube-api-access-stsvh\") pod \"redhat-operators-jdkg9\" (UID: 
\"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.362678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-catalog-content\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.365664 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-catalog-content\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.365771 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-utilities\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: E1206 09:09:04.436913 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fef1798_dde5_4ef8_a4fa_5a5997738964.slice/crio-conmon-82dcf6166f8225d14c8fbfa6628b8cee496f9ca4766716ed9e9e6fe50b818bb0.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.501169 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stsvh\" (UniqueName: \"kubernetes.io/projected/52c1e804-08d7-433b-b991-438c08d1bb62-kube-api-access-stsvh\") pod \"redhat-operators-jdkg9\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.596257 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.597321 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqx6n"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.643043 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.699785 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w587t"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.858072 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7htz"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.936719 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbpp6"] Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.969945 4672 generic.go:334] "Generic (PLEG): container finished" podID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerID="53dcfdc6e9ae892497324c12b3d0926f5143fae1cbbbf1d15f87a035cb7db4c4" exitCode=0 Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.970396 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtk7z" event={"ID":"88387d22-3fdc-4004-a9a1-4e6467c2c3f4","Type":"ContainerDied","Data":"53dcfdc6e9ae892497324c12b3d0926f5143fae1cbbbf1d15f87a035cb7db4c4"} Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.970543 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtk7z" event={"ID":"88387d22-3fdc-4004-a9a1-4e6467c2c3f4","Type":"ContainerStarted","Data":"3357306d01d41b4432de88f61aac1effd042a91062f27f8f1c05dcd7275d0300"} Dec 06 09:09:04 crc kubenswrapper[4672]: I1206 09:09:04.972953 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:09:05 crc kubenswrapper[4672]: W1206 09:09:05.004728 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf874c07b_7566_441d_9546_6c3f7b64de13.slice/crio-884e6a4f2788b70182a892c58f356d22ceb1bfc18abcd28f5f6bf5d06079dd69 WatchSource:0}: Error finding container 884e6a4f2788b70182a892c58f356d22ceb1bfc18abcd28f5f6bf5d06079dd69: Status 404 returned error can't find the container with id 884e6a4f2788b70182a892c58f356d22ceb1bfc18abcd28f5f6bf5d06079dd69 Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.005288 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerID="82dcf6166f8225d14c8fbfa6628b8cee496f9ca4766716ed9e9e6fe50b818bb0" exitCode=0 Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.005348 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerDied","Data":"82dcf6166f8225d14c8fbfa6628b8cee496f9ca4766716ed9e9e6fe50b818bb0"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.044171 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" event={"ID":"3a64ce15-c29c-46af-91a1-309857493594","Type":"ContainerStarted","Data":"07909b25fa81305e1baef0a68a34008555e30fb20e6334774240f130792c3603"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.074369 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wt47x" podStartSLOduration=17.074349981 podStartE2EDuration="17.074349981s" podCreationTimestamp="2025-12-06 09:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 
09:09:05.071117465 +0000 UTC m=+162.815377752" watchObservedRunningTime="2025-12-06 09:09:05.074349981 +0000 UTC m=+162.818610268" Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.077870 4672 generic.go:334] "Generic (PLEG): container finished" podID="d0417e2e-2041-42b0-a404-236595aa99bd" containerID="65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f" exitCode=0 Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.077972 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxl9d" event={"ID":"d0417e2e-2041-42b0-a404-236595aa99bd","Type":"ContainerDied","Data":"65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.078003 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxl9d" event={"ID":"d0417e2e-2041-42b0-a404-236595aa99bd","Type":"ContainerStarted","Data":"7737ee187c4aea6bad5e149d158948944e934c7bd16a4ccf03302f023c13859b"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.106397 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e09a9ce1-4aea-4258-9886-fa3312588ab8","Type":"ContainerStarted","Data":"135d8da9e7c1cad3ac2d4e3be9c2e7a02c262c96459d48e740f26bb40c8e5884"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.122223 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerStarted","Data":"eb163cf1d933c0e6932d94fcac7c4668e2dbd62160354e3cc58b69678cb43de2"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.124177 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerStarted","Data":"ffb8cd7dc1586d9b9d591d41fc728f4159f81977882de0534195d5ba9b170618"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.137515 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.13749847 podStartE2EDuration="4.13749847s" podCreationTimestamp="2025-12-06 09:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:05.131372887 +0000 UTC m=+162.875633174" watchObservedRunningTime="2025-12-06 09:09:05.13749847 +0000 UTC m=+162.881758747" Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.145799 4672 generic.go:334] "Generic (PLEG): container finished" podID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerID="388c3efc1675c4721a2094c340dc6bc242f8189700922d99b508bef4508be516" exitCode=0 Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.145915 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerDied","Data":"388c3efc1675c4721a2094c340dc6bc242f8189700922d99b508bef4508be516"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.145954 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerStarted","Data":"d05d4592cef79d30f0e5985c7dd9b43b87c513b73f4830f263fdef51cd7309b3"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.157516 4672 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.168518 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lcghp" Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.171045 4672 generic.go:334] "Generic (PLEG): container finished" podID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerID="f700390313e72449b5c7df7e410c296c7a28d52ea238d244a6a10c3fc67a3f6e" exitCode=0 Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.171170 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerDied","Data":"f700390313e72449b5c7df7e410c296c7a28d52ea238d244a6a10c3fc67a3f6e"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.184996 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w587t" event={"ID":"fca5f829-3091-4191-abf5-2bece3ab91f7","Type":"ContainerStarted","Data":"97aaa54ffae7a4e834f535ef06cdda28ed11c14c7a90c77500d820aeee8fb9e5"} Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.191218 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jdkg9"] Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.368730 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:05 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:05 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:05 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:05 crc kubenswrapper[4672]: I1206 09:09:05.368818 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.218900 4672 generic.go:334] "Generic (PLEG): container finished" podID="ade26230-5c3c-4b75-bef5-9383cab17974" containerID="ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b" exitCode=0 Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.220125 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerDied","Data":"ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.261748 4672 generic.go:334] "Generic (PLEG): container finished" podID="f512bc0f-8532-4696-aac9-746557867772" containerID="e6f99c0bc1f16b51b08f4b34ff3e32b48a7744e0e2f061252344ddee13e9e04c" exitCode=0 Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.261851 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerDied","Data":"e6f99c0bc1f16b51b08f4b34ff3e32b48a7744e0e2f061252344ddee13e9e04c"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.305165 4672 generic.go:334] "Generic (PLEG): container finished" podID="b8a761a8-3e6d-42eb-b0f8-db388dcf6952" 
containerID="713878a4e078961632df76317843927d6c71d5d3568ea257d4d5594048833128" exitCode=0 Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.305280 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" event={"ID":"b8a761a8-3e6d-42eb-b0f8-db388dcf6952","Type":"ContainerDied","Data":"713878a4e078961632df76317843927d6c71d5d3568ea257d4d5594048833128"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.344067 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:06 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:06 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:06 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.344188 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.352662 4672 generic.go:334] "Generic (PLEG): container finished" podID="52c1e804-08d7-433b-b991-438c08d1bb62" containerID="ec1376086a18dfb9310cdb9fb7e09b9a72485443c0b4d56696668e590cbf984c" exitCode=0 Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.352767 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerDied","Data":"ec1376086a18dfb9310cdb9fb7e09b9a72485443c0b4d56696668e590cbf984c"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.352807 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerStarted","Data":"7b0696877c6f16191cd733482b8987ab8adbc428a04e40e60df01bddb5c5897b"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.371933 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w587t" event={"ID":"fca5f829-3091-4191-abf5-2bece3ab91f7","Type":"ContainerStarted","Data":"03b8769287dee3f620fcacc347e969305a4c78b86ca6a07139e8b500693a7901"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.395071 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.397782 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.399719 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" event={"ID":"f874c07b-7566-441d-9546-6c3f7b64de13","Type":"ContainerStarted","Data":"083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.399874 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" event={"ID":"f874c07b-7566-441d-9546-6c3f7b64de13","Type":"ContainerStarted","Data":"884e6a4f2788b70182a892c58f356d22ceb1bfc18abcd28f5f6bf5d06079dd69"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.400826 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.401268 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.407324 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.411505 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.417836 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-w587t" podStartSLOduration=146.417816055 podStartE2EDuration="2m26.417816055s" podCreationTimestamp="2025-12-06 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:06.417151046 +0000 UTC m=+164.161411343" watchObservedRunningTime="2025-12-06 09:09:06.417816055 +0000 UTC m=+164.162076342" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.428265 4672 generic.go:334] "Generic (PLEG): container finished" podID="e09a9ce1-4aea-4258-9886-fa3312588ab8" containerID="135d8da9e7c1cad3ac2d4e3be9c2e7a02c262c96459d48e740f26bb40c8e5884" exitCode=0 Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.429531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e09a9ce1-4aea-4258-9886-fa3312588ab8","Type":"ContainerDied","Data":"135d8da9e7c1cad3ac2d4e3be9c2e7a02c262c96459d48e740f26bb40c8e5884"} Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.481686 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" podStartSLOduration=145.481644695 podStartE2EDuration="2m25.481644695s" podCreationTimestamp="2025-12-06 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:06.471511881 +0000 UTC m=+164.215772188" watchObservedRunningTime="2025-12-06 09:09:06.481644695 +0000 UTC m=+164.225904982" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.529774 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6812874b-5edb-455a-8d9b-656e501a9ff3-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.529998 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6812874b-5edb-455a-8d9b-656e501a9ff3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.530362 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rqdv8" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.632668 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6812874b-5edb-455a-8d9b-656e501a9ff3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.632804 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6812874b-5edb-455a-8d9b-656e501a9ff3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.633003 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6812874b-5edb-455a-8d9b-656e501a9ff3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.676795 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6812874b-5edb-455a-8d9b-656e501a9ff3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:06 crc kubenswrapper[4672]: I1206 09:09:06.749389 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:07 crc kubenswrapper[4672]: I1206 09:09:07.346973 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:07 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:07 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:07 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:07 crc kubenswrapper[4672]: I1206 09:09:07.347488 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:07 crc kubenswrapper[4672]: I1206 09:09:07.498395 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w587t" event={"ID":"fca5f829-3091-4191-abf5-2bece3ab91f7","Type":"ContainerStarted","Data":"3b83c86192d7a1a55b2ae780937246b8ef19b5664fdf2fbf0d1db9049e3a4ff5"} Dec 06 09:09:07 crc kubenswrapper[4672]: I1206 09:09:07.587699 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.094903 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.164658 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume\") pod \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.164828 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vwpz\" (UniqueName: \"kubernetes.io/projected/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-kube-api-access-2vwpz\") pod \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.165112 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-secret-volume\") pod \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\" (UID: \"b8a761a8-3e6d-42eb-b0f8-db388dcf6952\") " Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.166049 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume" (OuterVolumeSpecName: "config-volume") pod "b8a761a8-3e6d-42eb-b0f8-db388dcf6952" (UID: "b8a761a8-3e6d-42eb-b0f8-db388dcf6952"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.170825 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.176768 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b8a761a8-3e6d-42eb-b0f8-db388dcf6952" (UID: "b8a761a8-3e6d-42eb-b0f8-db388dcf6952"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.176822 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-kube-api-access-2vwpz" (OuterVolumeSpecName: "kube-api-access-2vwpz") pod "b8a761a8-3e6d-42eb-b0f8-db388dcf6952" (UID: "b8a761a8-3e6d-42eb-b0f8-db388dcf6952"). InnerVolumeSpecName "kube-api-access-2vwpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.267199 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e09a9ce1-4aea-4258-9886-fa3312588ab8-kubelet-dir\") pod \"e09a9ce1-4aea-4258-9886-fa3312588ab8\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.267323 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e09a9ce1-4aea-4258-9886-fa3312588ab8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e09a9ce1-4aea-4258-9886-fa3312588ab8" (UID: "e09a9ce1-4aea-4258-9886-fa3312588ab8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.267373 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e09a9ce1-4aea-4258-9886-fa3312588ab8-kube-api-access\") pod \"e09a9ce1-4aea-4258-9886-fa3312588ab8\" (UID: \"e09a9ce1-4aea-4258-9886-fa3312588ab8\") " Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.268207 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.268257 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.268269 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vwpz\" (UniqueName: \"kubernetes.io/projected/b8a761a8-3e6d-42eb-b0f8-db388dcf6952-kube-api-access-2vwpz\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.268279 4672 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e09a9ce1-4aea-4258-9886-fa3312588ab8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.272380 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09a9ce1-4aea-4258-9886-fa3312588ab8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e09a9ce1-4aea-4258-9886-fa3312588ab8" (UID: "e09a9ce1-4aea-4258-9886-fa3312588ab8"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.365076 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:08 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:08 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:08 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.365155 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.370266 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e09a9ce1-4aea-4258-9886-fa3312588ab8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.583809 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.585787 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6812874b-5edb-455a-8d9b-656e501a9ff3","Type":"ContainerStarted","Data":"9d8467ead2db64672ce6ac5e43da870d4ace3149063e6c5687d28c52cfe4087a"} Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.585842 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e09a9ce1-4aea-4258-9886-fa3312588ab8","Type":"ContainerDied","Data":"a97e0c8137d52094bcdd918bc5540218a31a958c4ae99e42159e3c230daa155f"} Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.585863 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97e0c8137d52094bcdd918bc5540218a31a958c4ae99e42159e3c230daa155f" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.591691 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.596048 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x" event={"ID":"b8a761a8-3e6d-42eb-b0f8-db388dcf6952","Type":"ContainerDied","Data":"81818567450fc7e4c6bde0f22e56da5186a9bba63c1fa437f01b1334ea4cd30a"} Dec 06 09:09:08 crc kubenswrapper[4672]: I1206 09:09:08.596094 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81818567450fc7e4c6bde0f22e56da5186a9bba63c1fa437f01b1334ea4cd30a" Dec 06 09:09:09 crc kubenswrapper[4672]: I1206 09:09:09.344079 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:09 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:09 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:09 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:09 crc kubenswrapper[4672]: I1206 09:09:09.344457 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:09 crc kubenswrapper[4672]: I1206 09:09:09.615279 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6812874b-5edb-455a-8d9b-656e501a9ff3","Type":"ContainerStarted","Data":"36a8c34e4a2325932f88d00f2bd4d0ecbdfef85cc6e66a606e85b1cf22b0618d"} Dec 06 09:09:10 crc kubenswrapper[4672]: I1206 09:09:10.076327 4672 patch_prober.go:28] interesting pod/console-f9d7485db-dcdqg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 06 09:09:10 crc kubenswrapper[4672]: I1206 09:09:10.076429 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dcdqg" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 06 09:09:10 crc kubenswrapper[4672]: I1206 09:09:10.260887 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x88bb" Dec 06 09:09:10 crc kubenswrapper[4672]: I1206 09:09:10.384887 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:10 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:10 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:10 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:10 crc kubenswrapper[4672]: I1206 09:09:10.384974 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:10 crc 
kubenswrapper[4672]: I1206 09:09:10.647914 4672 generic.go:334] "Generic (PLEG): container finished" podID="6812874b-5edb-455a-8d9b-656e501a9ff3" containerID="36a8c34e4a2325932f88d00f2bd4d0ecbdfef85cc6e66a606e85b1cf22b0618d" exitCode=0 Dec 06 09:09:10 crc kubenswrapper[4672]: I1206 09:09:10.647972 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6812874b-5edb-455a-8d9b-656e501a9ff3","Type":"ContainerDied","Data":"36a8c34e4a2325932f88d00f2bd4d0ecbdfef85cc6e66a606e85b1cf22b0618d"} Dec 06 09:09:11 crc kubenswrapper[4672]: I1206 09:09:11.342477 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:11 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:11 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:11 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:11 crc kubenswrapper[4672]: I1206 09:09:11.342558 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:11 crc kubenswrapper[4672]: I1206 09:09:11.429252 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.241150 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.319681 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.319764 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.344499 4672 patch_prober.go:28] interesting pod/router-default-5444994796-x9m9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 09:09:12 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Dec 06 09:09:12 crc kubenswrapper[4672]: [+]process-running ok Dec 06 09:09:12 crc kubenswrapper[4672]: healthz check failed Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.344567 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x9m9h" podUID="77d1174d-bfc2-4145-9bf2-c2b648f903e8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.412175 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/6812874b-5edb-455a-8d9b-656e501a9ff3-kubelet-dir\") pod \"6812874b-5edb-455a-8d9b-656e501a9ff3\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.412416 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6812874b-5edb-455a-8d9b-656e501a9ff3-kube-api-access\") pod \"6812874b-5edb-455a-8d9b-656e501a9ff3\" (UID: \"6812874b-5edb-455a-8d9b-656e501a9ff3\") " Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.412549 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6812874b-5edb-455a-8d9b-656e501a9ff3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6812874b-5edb-455a-8d9b-656e501a9ff3" (UID: "6812874b-5edb-455a-8d9b-656e501a9ff3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.413971 4672 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6812874b-5edb-455a-8d9b-656e501a9ff3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.433681 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6812874b-5edb-455a-8d9b-656e501a9ff3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6812874b-5edb-455a-8d9b-656e501a9ff3" (UID: "6812874b-5edb-455a-8d9b-656e501a9ff3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.515304 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6812874b-5edb-455a-8d9b-656e501a9ff3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.713403 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6812874b-5edb-455a-8d9b-656e501a9ff3","Type":"ContainerDied","Data":"9d8467ead2db64672ce6ac5e43da870d4ace3149063e6c5687d28c52cfe4087a"} Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.713471 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d8467ead2db64672ce6ac5e43da870d4ace3149063e6c5687d28c52cfe4087a" Dec 06 09:09:12 crc kubenswrapper[4672]: I1206 09:09:12.713546 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 09:09:13 crc kubenswrapper[4672]: I1206 09:09:13.343609 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:09:13 crc kubenswrapper[4672]: I1206 09:09:13.349762 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x9m9h" Dec 06 09:09:20 crc kubenswrapper[4672]: I1206 09:09:20.080058 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:09:20 crc kubenswrapper[4672]: I1206 09:09:20.085732 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:09:23 crc kubenswrapper[4672]: I1206 09:09:23.898222 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:09:31 crc kubenswrapper[4672]: I1206 09:09:31.091999 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c2g6x" Dec 06 09:09:31 crc kubenswrapper[4672]: I1206 09:09:31.101961 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 09:09:40 crc kubenswrapper[4672]: E1206 09:09:40.064675 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 09:09:40 crc kubenswrapper[4672]: E1206 09:09:40.065750 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmnd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hqx6n_openshift-marketplace(f512bc0f-8532-4696-aac9-746557867772): ErrImagePull: rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:40 crc kubenswrapper[4672]: E1206 09:09:40.067045 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hqx6n" podUID="f512bc0f-8532-4696-aac9-746557867772" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.044231 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.045035 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6812874b-5edb-455a-8d9b-656e501a9ff3" containerName="pruner" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045052 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6812874b-5edb-455a-8d9b-656e501a9ff3" containerName="pruner" Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.045075 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09a9ce1-4aea-4258-9886-fa3312588ab8" containerName="pruner" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045084 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09a9ce1-4aea-4258-9886-fa3312588ab8" containerName="pruner" Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.045100 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a761a8-3e6d-42eb-b0f8-db388dcf6952" containerName="collect-profiles" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045107 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a761a8-3e6d-42eb-b0f8-db388dcf6952" containerName="collect-profiles" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045252 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6812874b-5edb-455a-8d9b-656e501a9ff3" containerName="pruner" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045265 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09a9ce1-4aea-4258-9886-fa3312588ab8" containerName="pruner" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045273 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a761a8-3e6d-42eb-b0f8-db388dcf6952" containerName="collect-profiles" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.045792 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.048696 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.051136 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.051934 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.072813 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hqx6n" podUID="f512bc0f-8532-4696-aac9-746557867772" Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.154496 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.155087 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2s5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nxl9d_openshift-marketplace(d0417e2e-2041-42b0-a404-236595aa99bd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:42 crc kubenswrapper[4672]: E1206 09:09:42.156360 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/redhat-marketplace-nxl9d" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.172038 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc29443-7987-44c6-a536-853ed548e87d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.172171 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc29443-7987-44c6-a536-853ed548e87d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.274225 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc29443-7987-44c6-a536-853ed548e87d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.274373 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc29443-7987-44c6-a536-853ed548e87d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.274841 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc29443-7987-44c6-a536-853ed548e87d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.312834 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc29443-7987-44c6-a536-853ed548e87d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.319363 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.319423 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:09:42 crc kubenswrapper[4672]: I1206 09:09:42.370998 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.333069 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.334758 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.334854 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: E1206 09:09:47.445209 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nxl9d" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.453496 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-var-lock\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.454026 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.454121 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kube-api-access\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.555723 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kube-api-access\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.555808 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-var-lock\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.555867 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.555949 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.556050 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-var-lock\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.589524 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kube-api-access\") pod \"installer-9-crc\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:47 crc kubenswrapper[4672]: E1206 09:09:47.601893 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 09:09:47 crc kubenswrapper[4672]: E1206 09:09:47.602081 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stsvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jdkg9_openshift-marketplace(52c1e804-08d7-433b-b991-438c08d1bb62): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:47 crc kubenswrapper[4672]: E1206 09:09:47.603359 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jdkg9" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" Dec 06 09:09:47 crc kubenswrapper[4672]: I1206 09:09:47.686397 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.244500 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jdkg9" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.372744 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.373527 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zx465,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-69p9s_openshift-marketplace(de182d83-c1ee-4b2a-827b-90fbf7d1e626): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.375014 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-69p9s" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.387980 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.388189 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2w56l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zr2vx_openshift-marketplace(4fe165df-a354-4ea2-a51b-dabdaeb654f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.389349 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zr2vx" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.429889 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.430116 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5nlb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bpn8p_openshift-marketplace(9fef1798-dde5-4ef8-a4fa-5a5997738964): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.432979 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bpn8p" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.465451 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.467620 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bjxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-w7htz_openshift-marketplace(ade26230-5c3c-4b75-bef5-9383cab17974): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 09:09:51 crc kubenswrapper[4672]: E1206 09:09:51.469173 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-w7htz" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" Dec 06 09:09:51 crc kubenswrapper[4672]: I1206 09:09:51.620131 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 09:09:51 crc kubenswrapper[4672]: W1206 09:09:51.642385 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0dc29443_7987_44c6_a536_853ed548e87d.slice/crio-11ae73a31f39e9b4fbf9a638bb15d9e656d6b2106ee4299b5067a889fccd027e WatchSource:0}: Error finding container 11ae73a31f39e9b4fbf9a638bb15d9e656d6b2106ee4299b5067a889fccd027e: Status 404 returned error can't find the container with id 11ae73a31f39e9b4fbf9a638bb15d9e656d6b2106ee4299b5067a889fccd027e Dec 06 09:09:51 crc kubenswrapper[4672]: I1206 09:09:51.740810 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 09:09:52 crc kubenswrapper[4672]: I1206 09:09:52.350028 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9","Type":"ContainerStarted","Data":"5758bcbb33ca9ebbc408d7134193aed1b653558b2cff38e07ec1e595109b83c0"} Dec 06 09:09:52 crc kubenswrapper[4672]: I1206 09:09:52.353631 4672 generic.go:334] "Generic (PLEG): container finished" podID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerID="76b41f0806edc9db8e5cf7a9d902b73f2c71f8dbb4e98f6f4c95ff72614dc2fc" exitCode=0 Dec 06 09:09:52 crc kubenswrapper[4672]: I1206 09:09:52.353725 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtk7z" 
event={"ID":"88387d22-3fdc-4004-a9a1-4e6467c2c3f4","Type":"ContainerDied","Data":"76b41f0806edc9db8e5cf7a9d902b73f2c71f8dbb4e98f6f4c95ff72614dc2fc"} Dec 06 09:09:52 crc kubenswrapper[4672]: I1206 09:09:52.357851 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0dc29443-7987-44c6-a536-853ed548e87d","Type":"ContainerStarted","Data":"11ae73a31f39e9b4fbf9a638bb15d9e656d6b2106ee4299b5067a889fccd027e"} Dec 06 09:09:52 crc kubenswrapper[4672]: E1206 09:09:52.359546 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w7htz" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" Dec 06 09:09:52 crc kubenswrapper[4672]: E1206 09:09:52.359793 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bpn8p" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" Dec 06 09:09:52 crc kubenswrapper[4672]: E1206 09:09:52.360237 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zr2vx" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" Dec 06 09:09:52 crc kubenswrapper[4672]: E1206 09:09:52.364722 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-69p9s" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" Dec 06 09:09:53 crc kubenswrapper[4672]: I1206 09:09:53.365858 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9","Type":"ContainerStarted","Data":"3a0ad79623a09617def88fec1e134c89610428cd08342f84c8e98092b21b5618"} Dec 06 09:09:53 crc kubenswrapper[4672]: I1206 09:09:53.372079 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtk7z" event={"ID":"88387d22-3fdc-4004-a9a1-4e6467c2c3f4","Type":"ContainerStarted","Data":"50e6c69e5b31c9f343482f2aae087ca7e0ed2cd9a9afded7802d0de715b2a989"} Dec 06 09:09:53 crc kubenswrapper[4672]: I1206 09:09:53.380543 4672 generic.go:334] "Generic (PLEG): container finished" podID="0dc29443-7987-44c6-a536-853ed548e87d" containerID="616cc34160f441924c3602b9dbcaf321c47555200000320819cb0f12e488caf5" exitCode=0 Dec 06 09:09:53 crc kubenswrapper[4672]: I1206 09:09:53.380648 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0dc29443-7987-44c6-a536-853ed548e87d","Type":"ContainerDied","Data":"616cc34160f441924c3602b9dbcaf321c47555200000320819cb0f12e488caf5"} Dec 06 09:09:53 crc kubenswrapper[4672]: I1206 09:09:53.409485 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.409460056 podStartE2EDuration="6.409460056s" podCreationTimestamp="2025-12-06 09:09:47 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:53.38925146 +0000 UTC m=+211.133511747" watchObservedRunningTime="2025-12-06 09:09:53.409460056 +0000 UTC m=+211.153720353" Dec 06 09:09:53 crc kubenswrapper[4672]: I1206 09:09:53.431317 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vtk7z" podStartSLOduration=5.491477582 podStartE2EDuration="53.431301992s" podCreationTimestamp="2025-12-06 09:09:00 +0000 UTC" firstStartedPulling="2025-12-06 09:09:04.972711741 +0000 UTC m=+162.716972028" lastFinishedPulling="2025-12-06 09:09:52.912536151 +0000 UTC m=+210.656796438" observedRunningTime="2025-12-06 09:09:53.426479137 +0000 UTC m=+211.170739444" watchObservedRunningTime="2025-12-06 09:09:53.431301992 +0000 UTC m=+211.175562279" Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.685801 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.874810 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc29443-7987-44c6-a536-853ed548e87d-kube-api-access\") pod \"0dc29443-7987-44c6-a536-853ed548e87d\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.874917 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc29443-7987-44c6-a536-853ed548e87d-kubelet-dir\") pod \"0dc29443-7987-44c6-a536-853ed548e87d\" (UID: \"0dc29443-7987-44c6-a536-853ed548e87d\") " Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.875075 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0dc29443-7987-44c6-a536-853ed548e87d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0dc29443-7987-44c6-a536-853ed548e87d" (UID: "0dc29443-7987-44c6-a536-853ed548e87d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.875564 4672 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc29443-7987-44c6-a536-853ed548e87d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.892851 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc29443-7987-44c6-a536-853ed548e87d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0dc29443-7987-44c6-a536-853ed548e87d" (UID: "0dc29443-7987-44c6-a536-853ed548e87d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:54 crc kubenswrapper[4672]: I1206 09:09:54.977550 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc29443-7987-44c6-a536-853ed548e87d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:55 crc kubenswrapper[4672]: I1206 09:09:55.396289 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0dc29443-7987-44c6-a536-853ed548e87d","Type":"ContainerDied","Data":"11ae73a31f39e9b4fbf9a638bb15d9e656d6b2106ee4299b5067a889fccd027e"} Dec 06 09:09:55 crc kubenswrapper[4672]: I1206 09:09:55.396371 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ae73a31f39e9b4fbf9a638bb15d9e656d6b2106ee4299b5067a889fccd027e" Dec 06 09:09:55 crc kubenswrapper[4672]: I1206 09:09:55.396387 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 09:10:01 crc kubenswrapper[4672]: I1206 09:10:01.435526 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerStarted","Data":"8dae60887ef3a02e084ff0231ead8081329268148dd7675bb56c1e8f1cb6f655"} Dec 06 09:10:02 crc kubenswrapper[4672]: I1206 09:10:02.064927 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:10:02 crc kubenswrapper[4672]: I1206 09:10:02.065838 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:10:02 crc kubenswrapper[4672]: I1206 09:10:02.204232 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:10:02 crc kubenswrapper[4672]: I1206 09:10:02.448781 4672 generic.go:334] "Generic (PLEG): container finished" podID="f512bc0f-8532-4696-aac9-746557867772" containerID="8dae60887ef3a02e084ff0231ead8081329268148dd7675bb56c1e8f1cb6f655" exitCode=0 Dec 06 09:10:02 crc kubenswrapper[4672]: I1206 09:10:02.449725 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerDied","Data":"8dae60887ef3a02e084ff0231ead8081329268148dd7675bb56c1e8f1cb6f655"} Dec 06 09:10:02 crc kubenswrapper[4672]: I1206 09:10:02.501083 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.458478 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerStarted","Data":"1f490049b504f6d70a5a02f9f33a041a1cd1b7c2035ba9e83688241c245736d4"} Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.462886 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerStarted","Data":"9f32272056e25861fc1dc782a4d7e389ce81cbd6972fa4f38b0cc58ba6a3fa01"} Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.465444 4672 generic.go:334] "Generic (PLEG): container finished" podID="d0417e2e-2041-42b0-a404-236595aa99bd" 
containerID="660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514" exitCode=0 Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.465550 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxl9d" event={"ID":"d0417e2e-2041-42b0-a404-236595aa99bd","Type":"ContainerDied","Data":"660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514"} Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.479932 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hqx6n" podStartSLOduration=4.287602387 podStartE2EDuration="1m0.479910754s" podCreationTimestamp="2025-12-06 09:09:03 +0000 UTC" firstStartedPulling="2025-12-06 09:09:06.279676124 +0000 UTC m=+164.023936411" lastFinishedPulling="2025-12-06 09:10:02.471984491 +0000 UTC m=+220.216244778" observedRunningTime="2025-12-06 09:10:03.478734383 +0000 UTC m=+221.222994680" watchObservedRunningTime="2025-12-06 09:10:03.479910754 +0000 UTC m=+221.224171041" Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.500952 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:10:03 crc kubenswrapper[4672]: I1206 09:10:03.501127 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:10:04 crc kubenswrapper[4672]: I1206 09:10:04.472459 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxl9d" event={"ID":"d0417e2e-2041-42b0-a404-236595aa99bd","Type":"ContainerStarted","Data":"6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3"} Dec 06 09:10:04 crc kubenswrapper[4672]: I1206 09:10:04.474945 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerStarted","Data":"5ed13913c54cefcd86526e2776f8e0fd8f5ae248dfb2f0265ab4b6b9adc453bf"} Dec 06 09:10:04 crc kubenswrapper[4672]: I1206 09:10:04.476916 4672 generic.go:334] "Generic (PLEG): container finished" podID="52c1e804-08d7-433b-b991-438c08d1bb62" containerID="9f32272056e25861fc1dc782a4d7e389ce81cbd6972fa4f38b0cc58ba6a3fa01" exitCode=0 Dec 06 09:10:04 crc kubenswrapper[4672]: I1206 09:10:04.477322 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerDied","Data":"9f32272056e25861fc1dc782a4d7e389ce81cbd6972fa4f38b0cc58ba6a3fa01"} Dec 06 09:10:04 crc kubenswrapper[4672]: I1206 09:10:04.506407 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nxl9d" podStartSLOduration=3.740036688 podStartE2EDuration="1m2.506372038s" podCreationTimestamp="2025-12-06 09:09:02 +0000 UTC" firstStartedPulling="2025-12-06 09:09:05.085410902 +0000 UTC m=+162.829671189" lastFinishedPulling="2025-12-06 09:10:03.851746252 +0000 UTC m=+221.596006539" observedRunningTime="2025-12-06 09:10:04.501028189 +0000 UTC m=+222.245288476" watchObservedRunningTime="2025-12-06 09:10:04.506372038 +0000 UTC m=+222.250632325" Dec 06 09:10:04 crc kubenswrapper[4672]: I1206 09:10:04.552448 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hqx6n" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="registry-server" probeResult="failure" output=< 
Dec 06 09:10:04 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:10:04 crc kubenswrapper[4672]: > Dec 06 09:10:05 crc kubenswrapper[4672]: I1206 09:10:05.485748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerStarted","Data":"5aaac56bec57d8bc5f4b4a295ddce3557d01b2d3952ea981ba94d01f5f2c5c8f"} Dec 06 09:10:05 crc kubenswrapper[4672]: I1206 09:10:05.489638 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerDied","Data":"5ed13913c54cefcd86526e2776f8e0fd8f5ae248dfb2f0265ab4b6b9adc453bf"} Dec 06 09:10:05 crc kubenswrapper[4672]: I1206 09:10:05.489590 4672 generic.go:334] "Generic (PLEG): container finished" podID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerID="5ed13913c54cefcd86526e2776f8e0fd8f5ae248dfb2f0265ab4b6b9adc453bf" exitCode=0 Dec 06 09:10:05 crc kubenswrapper[4672]: I1206 09:10:05.492278 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerStarted","Data":"58ff296b9bf3bc9a665e530361fdc84c3f01770439116552460fd15eed58e510"} Dec 06 09:10:05 crc kubenswrapper[4672]: I1206 09:10:05.532941 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jdkg9" podStartSLOduration=3.029447758 podStartE2EDuration="1m1.532912233s" podCreationTimestamp="2025-12-06 09:09:04 +0000 UTC" firstStartedPulling="2025-12-06 09:09:06.358075839 +0000 UTC m=+164.102336126" lastFinishedPulling="2025-12-06 09:10:04.861540314 +0000 UTC m=+222.605800601" observedRunningTime="2025-12-06 09:10:05.528167131 +0000 UTC m=+223.272427438" watchObservedRunningTime="2025-12-06 09:10:05.532912233 +0000 UTC m=+223.277172520" Dec 06 09:10:06 crc kubenswrapper[4672]: I1206 09:10:06.501236 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerID="5aaac56bec57d8bc5f4b4a295ddce3557d01b2d3952ea981ba94d01f5f2c5c8f" exitCode=0 Dec 06 09:10:06 crc kubenswrapper[4672]: I1206 09:10:06.501333 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerDied","Data":"5aaac56bec57d8bc5f4b4a295ddce3557d01b2d3952ea981ba94d01f5f2c5c8f"} Dec 06 09:10:06 crc kubenswrapper[4672]: I1206 09:10:06.508277 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerStarted","Data":"b4ed48b70abb3e9449725bffbf3334c3f4ed6d0f5c2f6a3a94213968d6934175"} Dec 06 09:10:06 crc kubenswrapper[4672]: I1206 09:10:06.561519 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zr2vx" podStartSLOduration=5.841256175 podStartE2EDuration="1m6.561502213s" podCreationTimestamp="2025-12-06 09:09:00 +0000 UTC" firstStartedPulling="2025-12-06 09:09:05.181391783 +0000 UTC m=+162.925652070" lastFinishedPulling="2025-12-06 09:10:05.901637821 +0000 UTC m=+223.645898108" observedRunningTime="2025-12-06 09:10:06.555562558 +0000 UTC m=+224.299822845" watchObservedRunningTime="2025-12-06 09:10:06.561502213 +0000 UTC m=+224.305762500" Dec 06 09:10:07 crc 
kubenswrapper[4672]: I1206 09:10:07.516879 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerStarted","Data":"f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971"} Dec 06 09:10:07 crc kubenswrapper[4672]: I1206 09:10:07.520304 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerStarted","Data":"6cc7aa91ce5d9494b9fa52e131f888b29536be1e889d9c363e7af801bc054cee"} Dec 06 09:10:07 crc kubenswrapper[4672]: I1206 09:10:07.523026 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerStarted","Data":"50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13"} Dec 06 09:10:07 crc kubenswrapper[4672]: I1206 09:10:07.583964 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bpn8p" podStartSLOduration=5.712955281 podStartE2EDuration="1m7.583945782s" podCreationTimestamp="2025-12-06 09:09:00 +0000 UTC" firstStartedPulling="2025-12-06 09:09:05.011358937 +0000 UTC m=+162.755619224" lastFinishedPulling="2025-12-06 09:10:06.882349438 +0000 UTC m=+224.626609725" observedRunningTime="2025-12-06 09:10:07.581641733 +0000 UTC m=+225.325902020" watchObservedRunningTime="2025-12-06 09:10:07.583945782 +0000 UTC m=+225.328206069" Dec 06 09:10:08 crc kubenswrapper[4672]: I1206 09:10:08.532614 4672 generic.go:334] "Generic (PLEG): container finished" podID="ade26230-5c3c-4b75-bef5-9383cab17974" containerID="f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971" exitCode=0 Dec 06 09:10:08 crc kubenswrapper[4672]: I1206 09:10:08.532698 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerDied","Data":"f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971"} Dec 06 09:10:08 crc kubenswrapper[4672]: I1206 09:10:08.535868 4672 generic.go:334] "Generic (PLEG): container finished" podID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerID="6cc7aa91ce5d9494b9fa52e131f888b29536be1e889d9c363e7af801bc054cee" exitCode=0 Dec 06 09:10:08 crc kubenswrapper[4672]: I1206 09:10:08.535933 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerDied","Data":"6cc7aa91ce5d9494b9fa52e131f888b29536be1e889d9c363e7af801bc054cee"} Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.481614 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.482242 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.542685 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.553574 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 
09:10:11.553645 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.555160 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerStarted","Data":"64501d48b9be42ae64d444a72901e8005d93ca7d8ec3cdda352e158b95df3c0a"} Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.598642 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:10:11 crc kubenswrapper[4672]: I1206 09:10:11.605541 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69p9s" podStartSLOduration=5.347489472 podStartE2EDuration="1m10.605520641s" podCreationTimestamp="2025-12-06 09:09:01 +0000 UTC" firstStartedPulling="2025-12-06 09:09:05.149191989 +0000 UTC m=+162.893452276" lastFinishedPulling="2025-12-06 09:10:10.407223158 +0000 UTC m=+228.151483445" observedRunningTime="2025-12-06 09:10:11.601790105 +0000 UTC m=+229.346050392" watchObservedRunningTime="2025-12-06 09:10:11.605520641 +0000 UTC m=+229.349780928" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.049073 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.049164 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.333086 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.333145 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.333197 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.333827 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.333927 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b" gracePeriod=600 Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.565208 4672 generic.go:334] "Generic (PLEG): 
container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b" exitCode=0 Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.565268 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b"} Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.568716 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerStarted","Data":"3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770"} Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.592126 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w7htz" podStartSLOduration=4.700729955 podStartE2EDuration="1m9.59210401s" podCreationTimestamp="2025-12-06 09:09:03 +0000 UTC" firstStartedPulling="2025-12-06 09:09:06.234120301 +0000 UTC m=+163.978380588" lastFinishedPulling="2025-12-06 09:10:11.125494356 +0000 UTC m=+228.869754643" observedRunningTime="2025-12-06 09:10:12.589299377 +0000 UTC m=+230.333559664" watchObservedRunningTime="2025-12-06 09:10:12.59210401 +0000 UTC m=+230.336364297" Dec 06 09:10:12 crc kubenswrapper[4672]: I1206 09:10:12.627155 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.091748 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-69p9s" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="registry-server" probeResult="failure" output=< Dec 06 09:10:13 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:10:13 crc kubenswrapper[4672]: > Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.167018 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.167090 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.214561 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.540564 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.670623 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:10:13 crc kubenswrapper[4672]: I1206 09:10:13.694428 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.101946 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.102487 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.597054 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zr2vx"] Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.597402 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zr2vx" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="registry-server" containerID="cri-o://b4ed48b70abb3e9449725bffbf3334c3f4ed6d0f5c2f6a3a94213968d6934175" gracePeriod=2 Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.643479 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.643549 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:10:14 crc kubenswrapper[4672]: I1206 09:10:14.699009 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:10:15 crc kubenswrapper[4672]: I1206 09:10:15.144981 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w7htz" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="registry-server" probeResult="failure" output=< Dec 06 09:10:15 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:10:15 crc kubenswrapper[4672]: > Dec 06 09:10:15 crc kubenswrapper[4672]: I1206 09:10:15.590253 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"c07965cc625156f67df18ec68f14cf89ea9bd464984c84ab0aa0cd0dd54f62ac"} Dec 06 09:10:15 crc kubenswrapper[4672]: I1206 09:10:15.592798 4672 generic.go:334] "Generic (PLEG): container finished" podID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerID="b4ed48b70abb3e9449725bffbf3334c3f4ed6d0f5c2f6a3a94213968d6934175" exitCode=0 Dec 06 09:10:15 crc kubenswrapper[4672]: I1206 09:10:15.592899 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerDied","Data":"b4ed48b70abb3e9449725bffbf3334c3f4ed6d0f5c2f6a3a94213968d6934175"} Dec 06 09:10:15 crc kubenswrapper[4672]: I1206 09:10:15.662555 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.680451 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.733779 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w56l\" (UniqueName: \"kubernetes.io/projected/4fe165df-a354-4ea2-a51b-dabdaeb654f2-kube-api-access-2w56l\") pod \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.734251 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-utilities\") pod \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.734329 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-catalog-content\") pod \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\" (UID: \"4fe165df-a354-4ea2-a51b-dabdaeb654f2\") " Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.736373 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-utilities" (OuterVolumeSpecName: "utilities") pod "4fe165df-a354-4ea2-a51b-dabdaeb654f2" (UID: "4fe165df-a354-4ea2-a51b-dabdaeb654f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.749687 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe165df-a354-4ea2-a51b-dabdaeb654f2-kube-api-access-2w56l" (OuterVolumeSpecName: "kube-api-access-2w56l") pod "4fe165df-a354-4ea2-a51b-dabdaeb654f2" (UID: "4fe165df-a354-4ea2-a51b-dabdaeb654f2"). InnerVolumeSpecName "kube-api-access-2w56l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.779362 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm8cx"] Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.838506 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w56l\" (UniqueName: \"kubernetes.io/projected/4fe165df-a354-4ea2-a51b-dabdaeb654f2-kube-api-access-2w56l\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.838545 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.847936 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fe165df-a354-4ea2-a51b-dabdaeb654f2" (UID: "4fe165df-a354-4ea2-a51b-dabdaeb654f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.939876 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fe165df-a354-4ea2-a51b-dabdaeb654f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:16 crc kubenswrapper[4672]: I1206 09:10:16.996651 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jdkg9"] Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.193685 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqx6n"] Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.193955 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hqx6n" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="registry-server" containerID="cri-o://1f490049b504f6d70a5a02f9f33a041a1cd1b7c2035ba9e83688241c245736d4" gracePeriod=2 Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.607077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr2vx" event={"ID":"4fe165df-a354-4ea2-a51b-dabdaeb654f2","Type":"ContainerDied","Data":"8827559ef78bbc4a7dde429eab3d2909ba3a2d8890cd228143be2063a6bc1798"} Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.607149 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zr2vx" Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.607586 4672 scope.go:117] "RemoveContainer" containerID="b4ed48b70abb3e9449725bffbf3334c3f4ed6d0f5c2f6a3a94213968d6934175" Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.607230 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jdkg9" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="registry-server" containerID="cri-o://58ff296b9bf3bc9a665e530361fdc84c3f01770439116552460fd15eed58e510" gracePeriod=2 Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.630723 4672 scope.go:117] "RemoveContainer" containerID="5ed13913c54cefcd86526e2776f8e0fd8f5ae248dfb2f0265ab4b6b9adc453bf" Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.645415 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zr2vx"] Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.658358 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zr2vx"] Dec 06 09:10:17 crc kubenswrapper[4672]: I1206 09:10:17.665926 4672 scope.go:117] "RemoveContainer" containerID="f700390313e72449b5c7df7e410c296c7a28d52ea238d244a6a10c3fc67a3f6e" Dec 06 09:10:18 crc kubenswrapper[4672]: I1206 09:10:18.566193 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" path="/var/lib/kubelet/pods/4fe165df-a354-4ea2-a51b-dabdaeb654f2/volumes" Dec 06 09:10:18 crc kubenswrapper[4672]: I1206 09:10:18.619164 4672 generic.go:334] "Generic (PLEG): container finished" podID="f512bc0f-8532-4696-aac9-746557867772" containerID="1f490049b504f6d70a5a02f9f33a041a1cd1b7c2035ba9e83688241c245736d4" exitCode=0 Dec 06 09:10:18 crc kubenswrapper[4672]: I1206 09:10:18.619268 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" 
event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerDied","Data":"1f490049b504f6d70a5a02f9f33a041a1cd1b7c2035ba9e83688241c245736d4"} Dec 06 09:10:18 crc kubenswrapper[4672]: I1206 09:10:18.623040 4672 generic.go:334] "Generic (PLEG): container finished" podID="52c1e804-08d7-433b-b991-438c08d1bb62" containerID="58ff296b9bf3bc9a665e530361fdc84c3f01770439116552460fd15eed58e510" exitCode=0 Dec 06 09:10:18 crc kubenswrapper[4672]: I1206 09:10:18.623089 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerDied","Data":"58ff296b9bf3bc9a665e530361fdc84c3f01770439116552460fd15eed58e510"} Dec 06 09:10:18 crc kubenswrapper[4672]: I1206 09:10:18.980544 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.073408 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-utilities\") pod \"f512bc0f-8532-4696-aac9-746557867772\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.074216 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-catalog-content\") pod \"f512bc0f-8532-4696-aac9-746557867772\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.074307 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmnd2\" (UniqueName: \"kubernetes.io/projected/f512bc0f-8532-4696-aac9-746557867772-kube-api-access-hmnd2\") pod \"f512bc0f-8532-4696-aac9-746557867772\" (UID: \"f512bc0f-8532-4696-aac9-746557867772\") " Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.074533 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-utilities" (OuterVolumeSpecName: "utilities") pod "f512bc0f-8532-4696-aac9-746557867772" (UID: "f512bc0f-8532-4696-aac9-746557867772"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.074808 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.086734 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f512bc0f-8532-4696-aac9-746557867772-kube-api-access-hmnd2" (OuterVolumeSpecName: "kube-api-access-hmnd2") pod "f512bc0f-8532-4696-aac9-746557867772" (UID: "f512bc0f-8532-4696-aac9-746557867772"). InnerVolumeSpecName "kube-api-access-hmnd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.133031 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f512bc0f-8532-4696-aac9-746557867772" (UID: "f512bc0f-8532-4696-aac9-746557867772"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.176570 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmnd2\" (UniqueName: \"kubernetes.io/projected/f512bc0f-8532-4696-aac9-746557867772-kube-api-access-hmnd2\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.177057 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f512bc0f-8532-4696-aac9-746557867772-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.188800 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.277874 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stsvh\" (UniqueName: \"kubernetes.io/projected/52c1e804-08d7-433b-b991-438c08d1bb62-kube-api-access-stsvh\") pod \"52c1e804-08d7-433b-b991-438c08d1bb62\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.277923 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-catalog-content\") pod \"52c1e804-08d7-433b-b991-438c08d1bb62\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.278025 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-utilities\") pod \"52c1e804-08d7-433b-b991-438c08d1bb62\" (UID: \"52c1e804-08d7-433b-b991-438c08d1bb62\") " Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.279083 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-utilities" (OuterVolumeSpecName: "utilities") pod "52c1e804-08d7-433b-b991-438c08d1bb62" (UID: "52c1e804-08d7-433b-b991-438c08d1bb62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.287291 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c1e804-08d7-433b-b991-438c08d1bb62-kube-api-access-stsvh" (OuterVolumeSpecName: "kube-api-access-stsvh") pod "52c1e804-08d7-433b-b991-438c08d1bb62" (UID: "52c1e804-08d7-433b-b991-438c08d1bb62"). InnerVolumeSpecName "kube-api-access-stsvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.379859 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.379905 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stsvh\" (UniqueName: \"kubernetes.io/projected/52c1e804-08d7-433b-b991-438c08d1bb62-kube-api-access-stsvh\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.394137 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52c1e804-08d7-433b-b991-438c08d1bb62" (UID: "52c1e804-08d7-433b-b991-438c08d1bb62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.450197 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bpn8p"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.450614 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bpn8p" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="registry-server" containerID="cri-o://50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13" gracePeriod=30 Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.458898 4672 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.463025 4672 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.465587 4672 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.465679 4672 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/certified-operators-bpn8p" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.467698 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69p9s"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.468039 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69p9s" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" 
containerName="registry-server" containerID="cri-o://64501d48b9be42ae64d444a72901e8005d93ca7d8ec3cdda352e158b95df3c0a" gracePeriod=30 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.476188 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vtk7z"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.476543 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vtk7z" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="registry-server" containerID="cri-o://50e6c69e5b31c9f343482f2aae087ca7e0ed2cd9a9afded7802d0de715b2a989" gracePeriod=30 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.484675 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c1e804-08d7-433b-b991-438c08d1bb62-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.495455 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqnzx"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.499848 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" containerID="cri-o://deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1" gracePeriod=30 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.519056 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxl9d"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.519411 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nxl9d" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="registry-server" containerID="cri-o://6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3" gracePeriod=30 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.539484 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w7htz"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.539829 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w7htz" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="registry-server" containerID="cri-o://3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770" gracePeriod=30 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.553417 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhbdf"] Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554233 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="extract-utilities" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554265 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="extract-utilities" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554282 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="extract-content" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554291 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" 
containerName="extract-content" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554309 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="extract-utilities" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554329 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="extract-utilities" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554338 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554346 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554357 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc29443-7987-44c6-a536-853ed548e87d" containerName="pruner" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554364 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc29443-7987-44c6-a536-853ed548e87d" containerName="pruner" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554383 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="extract-utilities" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554392 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="extract-utilities" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554402 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="extract-content" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554409 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="extract-content" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554421 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="extract-content" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554431 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="extract-content" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554439 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554447 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: E1206 09:10:19.554461 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.554471 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.556236 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.556278 4672 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0dc29443-7987-44c6-a536-853ed548e87d" containerName="pruner" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.556291 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe165df-a354-4ea2-a51b-dabdaeb654f2" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.556303 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f512bc0f-8532-4696-aac9-746557867772" containerName="registry-server" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.557157 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.568989 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhbdf"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.591422 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-722vs\" (UniqueName: \"kubernetes.io/projected/6f374204-77e4-4b75-afaf-43579bc0506a-kube-api-access-722vs\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.591481 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f374204-77e4-4b75-afaf-43579bc0506a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.591535 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f374204-77e4-4b75-afaf-43579bc0506a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.662261 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqx6n" event={"ID":"f512bc0f-8532-4696-aac9-746557867772","Type":"ContainerDied","Data":"ffb8cd7dc1586d9b9d591d41fc728f4159f81977882de0534195d5ba9b170618"} Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.662406 4672 scope.go:117] "RemoveContainer" containerID="1f490049b504f6d70a5a02f9f33a041a1cd1b7c2035ba9e83688241c245736d4" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.662567 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqx6n" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.673145 4672 generic.go:334] "Generic (PLEG): container finished" podID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerID="64501d48b9be42ae64d444a72901e8005d93ca7d8ec3cdda352e158b95df3c0a" exitCode=0 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.673291 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerDied","Data":"64501d48b9be42ae64d444a72901e8005d93ca7d8ec3cdda352e158b95df3c0a"} Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.678714 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerID="50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13" exitCode=0 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.678995 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerDied","Data":"50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13"} Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.685262 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdkg9" event={"ID":"52c1e804-08d7-433b-b991-438c08d1bb62","Type":"ContainerDied","Data":"7b0696877c6f16191cd733482b8987ab8adbc428a04e40e60df01bddb5c5897b"} Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.685430 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdkg9" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.693453 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f374204-77e4-4b75-afaf-43579bc0506a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.693541 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-722vs\" (UniqueName: \"kubernetes.io/projected/6f374204-77e4-4b75-afaf-43579bc0506a-kube-api-access-722vs\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.693583 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f374204-77e4-4b75-afaf-43579bc0506a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.703896 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f374204-77e4-4b75-afaf-43579bc0506a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.708854 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f374204-77e4-4b75-afaf-43579bc0506a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.712413 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-722vs\" (UniqueName: \"kubernetes.io/projected/6f374204-77e4-4b75-afaf-43579bc0506a-kube-api-access-722vs\") pod \"marketplace-operator-79b997595-zhbdf\" (UID: \"6f374204-77e4-4b75-afaf-43579bc0506a\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.717854 4672 generic.go:334] "Generic (PLEG): container finished" podID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerID="50e6c69e5b31c9f343482f2aae087ca7e0ed2cd9a9afded7802d0de715b2a989" exitCode=0 Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.717910 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtk7z" event={"ID":"88387d22-3fdc-4004-a9a1-4e6467c2c3f4","Type":"ContainerDied","Data":"50e6c69e5b31c9f343482f2aae087ca7e0ed2cd9a9afded7802d0de715b2a989"} Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.812081 4672 scope.go:117] "RemoveContainer" containerID="8dae60887ef3a02e084ff0231ead8081329268148dd7675bb56c1e8f1cb6f655" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.822832 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.835428 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqx6n"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.835489 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqx6n"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.852889 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jdkg9"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.874292 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jdkg9"] Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.885920 4672 scope.go:117] "RemoveContainer" containerID="e6f99c0bc1f16b51b08f4b34ff3e32b48a7744e0e2f061252344ddee13e9e04c" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.911723 4672 scope.go:117] "RemoveContainer" containerID="58ff296b9bf3bc9a665e530361fdc84c3f01770439116552460fd15eed58e510" Dec 06 09:10:19 crc kubenswrapper[4672]: I1206 09:10:19.948802 4672 scope.go:117] "RemoveContainer" containerID="9f32272056e25861fc1dc782a4d7e389ce81cbd6972fa4f38b0cc58ba6a3fa01" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.009067 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.019712 4672 scope.go:117] "RemoveContainer" containerID="ec1376086a18dfb9310cdb9fb7e09b9a72485443c0b4d56696668e590cbf984c" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.097445 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-catalog-content\") pod \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.097504 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlpl2\" (UniqueName: \"kubernetes.io/projected/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-kube-api-access-dlpl2\") pod \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.097583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-utilities\") pod \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\" (UID: \"88387d22-3fdc-4004-a9a1-4e6467c2c3f4\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.114984 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-utilities" (OuterVolumeSpecName: "utilities") pod "88387d22-3fdc-4004-a9a1-4e6467c2c3f4" (UID: "88387d22-3fdc-4004-a9a1-4e6467c2c3f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.125901 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-kube-api-access-dlpl2" (OuterVolumeSpecName: "kube-api-access-dlpl2") pod "88387d22-3fdc-4004-a9a1-4e6467c2c3f4" (UID: "88387d22-3fdc-4004-a9a1-4e6467c2c3f4"). InnerVolumeSpecName "kube-api-access-dlpl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.165912 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.179268 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.195501 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.195827 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.200283 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-utilities\") pod \"d0417e2e-2041-42b0-a404-236595aa99bd\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.200375 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-catalog-content\") pod \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.200509 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-utilities\") pod \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.200613 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-catalog-content\") pod \"d0417e2e-2041-42b0-a404-236595aa99bd\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.200643 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx465\" (UniqueName: \"kubernetes.io/projected/de182d83-c1ee-4b2a-827b-90fbf7d1e626-kube-api-access-zx465\") pod \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\" (UID: \"de182d83-c1ee-4b2a-827b-90fbf7d1e626\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.200673 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2s5f\" (UniqueName: \"kubernetes.io/projected/d0417e2e-2041-42b0-a404-236595aa99bd-kube-api-access-t2s5f\") pod \"d0417e2e-2041-42b0-a404-236595aa99bd\" (UID: \"d0417e2e-2041-42b0-a404-236595aa99bd\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.201006 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlpl2\" (UniqueName: \"kubernetes.io/projected/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-kube-api-access-dlpl2\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.201025 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.201062 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-utilities" (OuterVolumeSpecName: "utilities") pod "d0417e2e-2041-42b0-a404-236595aa99bd" (UID: "d0417e2e-2041-42b0-a404-236595aa99bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.202408 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-utilities" (OuterVolumeSpecName: "utilities") pod "de182d83-c1ee-4b2a-827b-90fbf7d1e626" (UID: "de182d83-c1ee-4b2a-827b-90fbf7d1e626"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.202652 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.209951 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de182d83-c1ee-4b2a-827b-90fbf7d1e626-kube-api-access-zx465" (OuterVolumeSpecName: "kube-api-access-zx465") pod "de182d83-c1ee-4b2a-827b-90fbf7d1e626" (UID: "de182d83-c1ee-4b2a-827b-90fbf7d1e626"). InnerVolumeSpecName "kube-api-access-zx465". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.220418 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88387d22-3fdc-4004-a9a1-4e6467c2c3f4" (UID: "88387d22-3fdc-4004-a9a1-4e6467c2c3f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.227948 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0417e2e-2041-42b0-a404-236595aa99bd-kube-api-access-t2s5f" (OuterVolumeSpecName: "kube-api-access-t2s5f") pod "d0417e2e-2041-42b0-a404-236595aa99bd" (UID: "d0417e2e-2041-42b0-a404-236595aa99bd"). InnerVolumeSpecName "kube-api-access-t2s5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.234696 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0417e2e-2041-42b0-a404-236595aa99bd" (UID: "d0417e2e-2041-42b0-a404-236595aa99bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308367 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-utilities\") pod \"ade26230-5c3c-4b75-bef5-9383cab17974\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308466 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnt8\" (UniqueName: \"kubernetes.io/projected/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-kube-api-access-nhnt8\") pod \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308495 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-catalog-content\") pod \"ade26230-5c3c-4b75-bef5-9383cab17974\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308535 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nlb2\" (UniqueName: \"kubernetes.io/projected/9fef1798-dde5-4ef8-a4fa-5a5997738964-kube-api-access-5nlb2\") pod \"9fef1798-dde5-4ef8-a4fa-5a5997738964\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308644 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-utilities\") pod \"9fef1798-dde5-4ef8-a4fa-5a5997738964\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308698 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bjxt\" (UniqueName: \"kubernetes.io/projected/ade26230-5c3c-4b75-bef5-9383cab17974-kube-api-access-2bjxt\") pod \"ade26230-5c3c-4b75-bef5-9383cab17974\" (UID: \"ade26230-5c3c-4b75-bef5-9383cab17974\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308735 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-operator-metrics\") pod \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308764 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-trusted-ca\") pod \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\" (UID: \"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.308831 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-catalog-content\") pod \"9fef1798-dde5-4ef8-a4fa-5a5997738964\" (UID: \"9fef1798-dde5-4ef8-a4fa-5a5997738964\") " Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.309325 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-utilities\") on node \"crc\" 
DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.309346 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88387d22-3fdc-4004-a9a1-4e6467c2c3f4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.309359 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.309373 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0417e2e-2041-42b0-a404-236595aa99bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.309413 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx465\" (UniqueName: \"kubernetes.io/projected/de182d83-c1ee-4b2a-827b-90fbf7d1e626-kube-api-access-zx465\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.309423 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2s5f\" (UniqueName: \"kubernetes.io/projected/d0417e2e-2041-42b0-a404-236595aa99bd-kube-api-access-t2s5f\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.314839 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-utilities" (OuterVolumeSpecName: "utilities") pod "9fef1798-dde5-4ef8-a4fa-5a5997738964" (UID: "9fef1798-dde5-4ef8-a4fa-5a5997738964"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.315870 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-utilities" (OuterVolumeSpecName: "utilities") pod "ade26230-5c3c-4b75-bef5-9383cab17974" (UID: "ade26230-5c3c-4b75-bef5-9383cab17974"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.322363 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" (UID: "2dfe938c-2f3d-4e4c-9156-d2d87b4478fe"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.323029 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fef1798-dde5-4ef8-a4fa-5a5997738964-kube-api-access-5nlb2" (OuterVolumeSpecName: "kube-api-access-5nlb2") pod "9fef1798-dde5-4ef8-a4fa-5a5997738964" (UID: "9fef1798-dde5-4ef8-a4fa-5a5997738964"). InnerVolumeSpecName "kube-api-access-5nlb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.324307 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" (UID: "2dfe938c-2f3d-4e4c-9156-d2d87b4478fe"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.337806 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-kube-api-access-nhnt8" (OuterVolumeSpecName: "kube-api-access-nhnt8") pod "2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" (UID: "2dfe938c-2f3d-4e4c-9156-d2d87b4478fe"). InnerVolumeSpecName "kube-api-access-nhnt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.354179 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de182d83-c1ee-4b2a-827b-90fbf7d1e626" (UID: "de182d83-c1ee-4b2a-827b-90fbf7d1e626"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.357836 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade26230-5c3c-4b75-bef5-9383cab17974-kube-api-access-2bjxt" (OuterVolumeSpecName: "kube-api-access-2bjxt") pod "ade26230-5c3c-4b75-bef5-9383cab17974" (UID: "ade26230-5c3c-4b75-bef5-9383cab17974"). InnerVolumeSpecName "kube-api-access-2bjxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.404438 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fef1798-dde5-4ef8-a4fa-5a5997738964" (UID: "9fef1798-dde5-4ef8-a4fa-5a5997738964"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410450 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bjxt\" (UniqueName: \"kubernetes.io/projected/ade26230-5c3c-4b75-bef5-9383cab17974-kube-api-access-2bjxt\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410482 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410501 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410514 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410529 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410544 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnt8\" (UniqueName: \"kubernetes.io/projected/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe-kube-api-access-nhnt8\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410558 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nlb2\" (UniqueName: \"kubernetes.io/projected/9fef1798-dde5-4ef8-a4fa-5a5997738964-kube-api-access-5nlb2\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410570 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de182d83-c1ee-4b2a-827b-90fbf7d1e626-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.410582 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fef1798-dde5-4ef8-a4fa-5a5997738964-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.448282 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ade26230-5c3c-4b75-bef5-9383cab17974" (UID: "ade26230-5c3c-4b75-bef5-9383cab17974"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.490491 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhbdf"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.511612 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade26230-5c3c-4b75-bef5-9383cab17974-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.565872 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c1e804-08d7-433b-b991-438c08d1bb62" path="/var/lib/kubelet/pods/52c1e804-08d7-433b-b991-438c08d1bb62/volumes" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.566493 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f512bc0f-8532-4696-aac9-746557867772" path="/var/lib/kubelet/pods/f512bc0f-8532-4696-aac9-746557867772/volumes" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.727120 4672 generic.go:334] "Generic (PLEG): container finished" podID="d0417e2e-2041-42b0-a404-236595aa99bd" containerID="6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3" exitCode=0 Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.727225 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxl9d" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.727245 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxl9d" event={"ID":"d0417e2e-2041-42b0-a404-236595aa99bd","Type":"ContainerDied","Data":"6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.727333 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxl9d" event={"ID":"d0417e2e-2041-42b0-a404-236595aa99bd","Type":"ContainerDied","Data":"7737ee187c4aea6bad5e149d158948944e934c7bd16a4ccf03302f023c13859b"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.727360 4672 scope.go:117] "RemoveContainer" containerID="6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.731918 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vtk7z" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.731772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtk7z" event={"ID":"88387d22-3fdc-4004-a9a1-4e6467c2c3f4","Type":"ContainerDied","Data":"3357306d01d41b4432de88f61aac1effd042a91062f27f8f1c05dcd7275d0300"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.736973 4672 generic.go:334] "Generic (PLEG): container finished" podID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerID="deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1" exitCode=0 Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.737076 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.737252 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" event={"ID":"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe","Type":"ContainerDied","Data":"deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.737368 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqnzx" event={"ID":"2dfe938c-2f3d-4e4c-9156-d2d87b4478fe","Type":"ContainerDied","Data":"9b0b85a30439922f0c7b1d34df7c8da11cab18082c28030bd073f9e39ae3f35d"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.745357 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bpn8p" event={"ID":"9fef1798-dde5-4ef8-a4fa-5a5997738964","Type":"ContainerDied","Data":"1b904ee9d69070c8143dee3bb0b495d315ab3654c8ddad31e7fdaa76b55872c1"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.745713 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bpn8p" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.761313 4672 scope.go:117] "RemoveContainer" containerID="660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.762772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" event={"ID":"6f374204-77e4-4b75-afaf-43579bc0506a","Type":"ContainerStarted","Data":"d35a1143b972b6b272beeabcfdb32da2f6259c377dfc9a4184c9ece2aeb1154f"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.762824 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" event={"ID":"6f374204-77e4-4b75-afaf-43579bc0506a","Type":"ContainerStarted","Data":"c8b8783a6064976c1795078aa6e5dbc17143ea3bd1b8e36c5c216e8aa157cf2c"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.763057 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.763167 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxl9d"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.776791 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhbdf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" start-of-body= Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.777331 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" podUID="6f374204-77e4-4b75-afaf-43579bc0506a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.780964 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxl9d"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.781362 4672 generic.go:334] "Generic (PLEG): container finished" 
podID="ade26230-5c3c-4b75-bef5-9383cab17974" containerID="3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770" exitCode=0 Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.781914 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerDied","Data":"3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.782007 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7htz" event={"ID":"ade26230-5c3c-4b75-bef5-9383cab17974","Type":"ContainerDied","Data":"eb163cf1d933c0e6932d94fcac7c4668e2dbd62160354e3cc58b69678cb43de2"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.782225 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7htz" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.792305 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69p9s" event={"ID":"de182d83-c1ee-4b2a-827b-90fbf7d1e626","Type":"ContainerDied","Data":"d05d4592cef79d30f0e5985c7dd9b43b87c513b73f4830f263fdef51cd7309b3"} Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.792724 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69p9s" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.795103 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vtk7z"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.798350 4672 scope.go:117] "RemoveContainer" containerID="65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.800047 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vtk7z"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.802915 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqnzx"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.809109 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqnzx"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.813127 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bpn8p"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.824956 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bpn8p"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.829784 4672 scope.go:117] "RemoveContainer" containerID="6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3" Dec 06 09:10:20 crc kubenswrapper[4672]: E1206 09:10:20.830568 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3\": container with ID starting with 6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3 not found: ID does not exist" containerID="6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.830710 4672 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3"} err="failed to get container status \"6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3\": rpc error: code = NotFound desc = could not find container \"6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3\": container with ID starting with 6a3045ed0620a57837ea7015e6b574508029ee6d010a85310ab89287e27baef3 not found: ID does not exist" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.830824 4672 scope.go:117] "RemoveContainer" containerID="660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514" Dec 06 09:10:20 crc kubenswrapper[4672]: E1206 09:10:20.834277 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514\": container with ID starting with 660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514 not found: ID does not exist" containerID="660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.834530 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514"} err="failed to get container status \"660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514\": rpc error: code = NotFound desc = could not find container \"660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514\": container with ID starting with 660e618a7f1c3314d8869abd5de51ffcf4c444f15e559b3cdc18fca66303b514 not found: ID does not exist" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.834852 4672 scope.go:117] "RemoveContainer" containerID="65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f" Dec 06 09:10:20 crc kubenswrapper[4672]: E1206 09:10:20.835336 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f\": container with ID starting with 65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f not found: ID does not exist" containerID="65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.835402 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f"} err="failed to get container status \"65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f\": rpc error: code = NotFound desc = could not find container \"65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f\": container with ID starting with 65049d3f0a7a92b85a66828abf6e720274ec0a33a7dea6541feab55e794c104f not found: ID does not exist" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.835444 4672 scope.go:117] "RemoveContainer" containerID="50e6c69e5b31c9f343482f2aae087ca7e0ed2cd9a9afded7802d0de715b2a989" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.837274 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w7htz"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.842144 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w7htz"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.849553 4672 scope.go:117] 
"RemoveContainer" containerID="76b41f0806edc9db8e5cf7a9d902b73f2c71f8dbb4e98f6f4c95ff72614dc2fc" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.860836 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" podStartSLOduration=1.860817081 podStartE2EDuration="1.860817081s" podCreationTimestamp="2025-12-06 09:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:10:20.858427648 +0000 UTC m=+238.602687935" watchObservedRunningTime="2025-12-06 09:10:20.860817081 +0000 UTC m=+238.605077368" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.875403 4672 scope.go:117] "RemoveContainer" containerID="53dcfdc6e9ae892497324c12b3d0926f5143fae1cbbbf1d15f87a035cb7db4c4" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.879301 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69p9s"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.882192 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69p9s"] Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.888990 4672 scope.go:117] "RemoveContainer" containerID="deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.902173 4672 scope.go:117] "RemoveContainer" containerID="deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1" Dec 06 09:10:20 crc kubenswrapper[4672]: E1206 09:10:20.902592 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1\": container with ID starting with deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1 not found: ID does not exist" containerID="deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.902683 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1"} err="failed to get container status \"deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1\": rpc error: code = NotFound desc = could not find container \"deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1\": container with ID starting with deda8a0fd768b7c3b6f51063462d2096a2bc074793f79e14570c2fdb7e3eacd1 not found: ID does not exist" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.902726 4672 scope.go:117] "RemoveContainer" containerID="50743ccd030cdb26fb22870253afc2198d1077b51dba6f936f87a4f22d1c7a13" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.915301 4672 scope.go:117] "RemoveContainer" containerID="5aaac56bec57d8bc5f4b4a295ddce3557d01b2d3952ea981ba94d01f5f2c5c8f" Dec 06 09:10:20 crc kubenswrapper[4672]: I1206 09:10:20.982569 4672 scope.go:117] "RemoveContainer" containerID="82dcf6166f8225d14c8fbfa6628b8cee496f9ca4766716ed9e9e6fe50b818bb0" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.004695 4672 scope.go:117] "RemoveContainer" containerID="3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.025786 4672 scope.go:117] "RemoveContainer" containerID="f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971" Dec 06 09:10:21 crc 
kubenswrapper[4672]: I1206 09:10:21.047493 4672 scope.go:117] "RemoveContainer" containerID="ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.063697 4672 scope.go:117] "RemoveContainer" containerID="3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.064270 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770\": container with ID starting with 3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770 not found: ID does not exist" containerID="3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.064310 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770"} err="failed to get container status \"3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770\": rpc error: code = NotFound desc = could not find container \"3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770\": container with ID starting with 3ada5976eb603ee3050396a060e37cf71e72f43e27eb2bbc60d7e6f2e567d770 not found: ID does not exist" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.064360 4672 scope.go:117] "RemoveContainer" containerID="f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.064883 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971\": container with ID starting with f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971 not found: ID does not exist" containerID="f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.064941 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971"} err="failed to get container status \"f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971\": rpc error: code = NotFound desc = could not find container \"f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971\": container with ID starting with f83ea81228584af1950b6158648df5c54e300e48bb620bcda4c6cf90b2685971 not found: ID does not exist" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.064959 4672 scope.go:117] "RemoveContainer" containerID="ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.066906 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b\": container with ID starting with ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b not found: ID does not exist" containerID="ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.066936 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b"} err="failed to get container status 
\"ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b\": rpc error: code = NotFound desc = could not find container \"ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b\": container with ID starting with ebb9a7c57ed11e45af592c7cf61910e7758db51a3df1139aeef6bc02d6658b4b not found: ID does not exist" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.066955 4672 scope.go:117] "RemoveContainer" containerID="64501d48b9be42ae64d444a72901e8005d93ca7d8ec3cdda352e158b95df3c0a" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.083341 4672 scope.go:117] "RemoveContainer" containerID="6cc7aa91ce5d9494b9fa52e131f888b29536be1e889d9c363e7af801bc054cee" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.108687 4672 scope.go:117] "RemoveContainer" containerID="388c3efc1675c4721a2094c340dc6bc242f8189700922d99b508bef4508be516" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.405861 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4mbvq"] Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406169 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406189 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406207 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406216 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406228 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406238 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406252 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406260 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406274 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406286 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406294 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406303 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406317 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406327 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406339 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406348 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406366 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406375 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406387 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406396 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406407 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406415 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406425 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406434 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406445 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406454 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406465 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406473 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="extract-content" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406486 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406494 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: E1206 09:10:21.406505 4672 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406514 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="extract-utilities" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406666 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406684 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406695 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" containerName="marketplace-operator" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406708 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406718 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.406729 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" containerName="registry-server" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.407957 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.410818 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.420622 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mbvq"] Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.542367 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55620a10-8ac9-47b4-88b9-7129c90c4ee4-utilities\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.542448 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4p5q\" (UniqueName: \"kubernetes.io/projected/55620a10-8ac9-47b4-88b9-7129c90c4ee4-kube-api-access-g4p5q\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.542648 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55620a10-8ac9-47b4-88b9-7129c90c4ee4-catalog-content\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.644322 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/55620a10-8ac9-47b4-88b9-7129c90c4ee4-utilities\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.644422 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4p5q\" (UniqueName: \"kubernetes.io/projected/55620a10-8ac9-47b4-88b9-7129c90c4ee4-kube-api-access-g4p5q\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.644520 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55620a10-8ac9-47b4-88b9-7129c90c4ee4-catalog-content\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.645060 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55620a10-8ac9-47b4-88b9-7129c90c4ee4-utilities\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.646181 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55620a10-8ac9-47b4-88b9-7129c90c4ee4-catalog-content\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.666537 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4p5q\" (UniqueName: \"kubernetes.io/projected/55620a10-8ac9-47b4-88b9-7129c90c4ee4-kube-api-access-g4p5q\") pod \"redhat-marketplace-4mbvq\" (UID: \"55620a10-8ac9-47b4-88b9-7129c90c4ee4\") " pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.726813 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:21 crc kubenswrapper[4672]: I1206 09:10:21.822618 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhbdf" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.003247 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rrkkx"] Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.004374 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.007176 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.043617 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrkkx"] Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.053735 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c188e7d-d705-41ce-bf0d-468de7745723-catalog-content\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.053805 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c188e7d-d705-41ce-bf0d-468de7745723-utilities\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.053838 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vjkl\" (UniqueName: \"kubernetes.io/projected/7c188e7d-d705-41ce-bf0d-468de7745723-kube-api-access-8vjkl\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.155012 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c188e7d-d705-41ce-bf0d-468de7745723-catalog-content\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.155095 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c188e7d-d705-41ce-bf0d-468de7745723-utilities\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.155123 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vjkl\" (UniqueName: \"kubernetes.io/projected/7c188e7d-d705-41ce-bf0d-468de7745723-kube-api-access-8vjkl\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.155625 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c188e7d-d705-41ce-bf0d-468de7745723-catalog-content\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.155978 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c188e7d-d705-41ce-bf0d-468de7745723-utilities\") pod \"certified-operators-rrkkx\" (UID: 
\"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.180711 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vjkl\" (UniqueName: \"kubernetes.io/projected/7c188e7d-d705-41ce-bf0d-468de7745723-kube-api-access-8vjkl\") pod \"certified-operators-rrkkx\" (UID: \"7c188e7d-d705-41ce-bf0d-468de7745723\") " pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.210394 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mbvq"] Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.328240 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrkkx" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.577149 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfe938c-2f3d-4e4c-9156-d2d87b4478fe" path="/var/lib/kubelet/pods/2dfe938c-2f3d-4e4c-9156-d2d87b4478fe/volumes" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.584984 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88387d22-3fdc-4004-a9a1-4e6467c2c3f4" path="/var/lib/kubelet/pods/88387d22-3fdc-4004-a9a1-4e6467c2c3f4/volumes" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.585664 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fef1798-dde5-4ef8-a4fa-5a5997738964" path="/var/lib/kubelet/pods/9fef1798-dde5-4ef8-a4fa-5a5997738964/volumes" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.586796 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade26230-5c3c-4b75-bef5-9383cab17974" path="/var/lib/kubelet/pods/ade26230-5c3c-4b75-bef5-9383cab17974/volumes" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.587378 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0417e2e-2041-42b0-a404-236595aa99bd" path="/var/lib/kubelet/pods/d0417e2e-2041-42b0-a404-236595aa99bd/volumes" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.588236 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de182d83-c1ee-4b2a-827b-90fbf7d1e626" path="/var/lib/kubelet/pods/de182d83-c1ee-4b2a-827b-90fbf7d1e626/volumes" Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.755076 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrkkx"] Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.827086 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrkkx" event={"ID":"7c188e7d-d705-41ce-bf0d-468de7745723","Type":"ContainerStarted","Data":"599bb2a8fa581af652d32ab4d4a4ab86231e4eaab8af4415213101b6d27c3119"} Dec 06 09:10:22 crc kubenswrapper[4672]: I1206 09:10:22.829306 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mbvq" event={"ID":"55620a10-8ac9-47b4-88b9-7129c90c4ee4","Type":"ContainerStarted","Data":"77b66c914bdd626af583630f057008e753da7a14dc5921ddad10c1a4a4032c67"} Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.802658 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x867j"] Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.804106 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.810408 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.818564 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x867j"] Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.839916 4672 generic.go:334] "Generic (PLEG): container finished" podID="7c188e7d-d705-41ce-bf0d-468de7745723" containerID="62489942185bd5d195e13daa2ae72b9616c64348ce0edfbb121ad2a7b5cd3018" exitCode=0 Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.839983 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrkkx" event={"ID":"7c188e7d-d705-41ce-bf0d-468de7745723","Type":"ContainerDied","Data":"62489942185bd5d195e13daa2ae72b9616c64348ce0edfbb121ad2a7b5cd3018"} Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.844798 4672 generic.go:334] "Generic (PLEG): container finished" podID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" containerID="26f67fc849b68c25921c132b6db7eac9c39613da7227a60ec210acfe928b02ab" exitCode=0 Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.844839 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mbvq" event={"ID":"55620a10-8ac9-47b4-88b9-7129c90c4ee4","Type":"ContainerDied","Data":"26f67fc849b68c25921c132b6db7eac9c39613da7227a60ec210acfe928b02ab"} Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.885970 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6f78753-7aad-4178-bff5-d45475f4a3df-utilities\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.886057 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxckw\" (UniqueName: \"kubernetes.io/projected/e6f78753-7aad-4178-bff5-d45475f4a3df-kube-api-access-nxckw\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.886375 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6f78753-7aad-4178-bff5-d45475f4a3df-catalog-content\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.987638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxckw\" (UniqueName: \"kubernetes.io/projected/e6f78753-7aad-4178-bff5-d45475f4a3df-kube-api-access-nxckw\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.987988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6f78753-7aad-4178-bff5-d45475f4a3df-catalog-content\") pod \"community-operators-x867j\" (UID: 
\"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.988741 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6f78753-7aad-4178-bff5-d45475f4a3df-catalog-content\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.988798 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6f78753-7aad-4178-bff5-d45475f4a3df-utilities\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:23 crc kubenswrapper[4672]: I1206 09:10:23.989581 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6f78753-7aad-4178-bff5-d45475f4a3df-utilities\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.016995 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxckw\" (UniqueName: \"kubernetes.io/projected/e6f78753-7aad-4178-bff5-d45475f4a3df-kube-api-access-nxckw\") pod \"community-operators-x867j\" (UID: \"e6f78753-7aad-4178-bff5-d45475f4a3df\") " pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.127330 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.399805 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8whpj"] Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.402826 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.407531 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.424176 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8whpj"] Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.495295 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4p4\" (UniqueName: \"kubernetes.io/projected/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-kube-api-access-8x4p4\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.495369 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-utilities\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.495448 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-catalog-content\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.589165 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x867j"] Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.596572 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-catalog-content\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.597147 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x4p4\" (UniqueName: \"kubernetes.io/projected/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-kube-api-access-8x4p4\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.597278 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-utilities\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.597891 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-utilities\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.598157 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-catalog-content\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: W1206 09:10:24.604532 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f78753_7aad_4178_bff5_d45475f4a3df.slice/crio-a4ab2dc3c397ad453e79c5eda17f9a4179d63ca3470cf4b1a50ad949078a2465 WatchSource:0}: Error finding container a4ab2dc3c397ad453e79c5eda17f9a4179d63ca3470cf4b1a50ad949078a2465: Status 404 returned error can't find the container with id a4ab2dc3c397ad453e79c5eda17f9a4179d63ca3470cf4b1a50ad949078a2465 Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.621222 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x4p4\" (UniqueName: \"kubernetes.io/projected/e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7-kube-api-access-8x4p4\") pod \"redhat-operators-8whpj\" (UID: \"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7\") " pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.740254 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.891175 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrkkx" event={"ID":"7c188e7d-d705-41ce-bf0d-468de7745723","Type":"ContainerStarted","Data":"5b7795c737da850396bb59f5ac72c3f2ad5300f05f57f3148285c4bdfa6c1434"} Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.896807 4672 generic.go:334] "Generic (PLEG): container finished" podID="e6f78753-7aad-4178-bff5-d45475f4a3df" containerID="d61f3ed94115c561dd23e9d6453e5798e1416b6b154b831a926382dd64cb85ca" exitCode=0 Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.896880 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x867j" event={"ID":"e6f78753-7aad-4178-bff5-d45475f4a3df","Type":"ContainerDied","Data":"d61f3ed94115c561dd23e9d6453e5798e1416b6b154b831a926382dd64cb85ca"} Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.896906 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x867j" event={"ID":"e6f78753-7aad-4178-bff5-d45475f4a3df","Type":"ContainerStarted","Data":"a4ab2dc3c397ad453e79c5eda17f9a4179d63ca3470cf4b1a50ad949078a2465"} Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.925114 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mbvq" event={"ID":"55620a10-8ac9-47b4-88b9-7129c90c4ee4","Type":"ContainerDied","Data":"c1fa8961f0ddd9e084cb5ee5cfe9872ebb9d69c4b79525dc701fa51b8e5daa69"} Dec 06 09:10:24 crc kubenswrapper[4672]: I1206 09:10:24.924244 4672 generic.go:334] "Generic (PLEG): container finished" podID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" containerID="c1fa8961f0ddd9e084cb5ee5cfe9872ebb9d69c4b79525dc701fa51b8e5daa69" exitCode=0 Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.259940 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8whpj"] Dec 06 09:10:25 crc kubenswrapper[4672]: W1206 09:10:25.269863 4672 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40e6dc4_bcb4_420e_93a6_7eb7c11e12c7.slice/crio-f33cba891b49eada65b568e729c3f43e958e8f28df675ba553bcbad5d67775ee WatchSource:0}: Error finding container f33cba891b49eada65b568e729c3f43e958e8f28df675ba553bcbad5d67775ee: Status 404 returned error can't find the container with id f33cba891b49eada65b568e729c3f43e958e8f28df675ba553bcbad5d67775ee Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.940382 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mbvq" event={"ID":"55620a10-8ac9-47b4-88b9-7129c90c4ee4","Type":"ContainerStarted","Data":"1f047882e2990c3d5565e6974688aa59cf9731a61a63ecfc641777187b974401"} Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.946737 4672 generic.go:334] "Generic (PLEG): container finished" podID="7c188e7d-d705-41ce-bf0d-468de7745723" containerID="5b7795c737da850396bb59f5ac72c3f2ad5300f05f57f3148285c4bdfa6c1434" exitCode=0 Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.946840 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrkkx" event={"ID":"7c188e7d-d705-41ce-bf0d-468de7745723","Type":"ContainerDied","Data":"5b7795c737da850396bb59f5ac72c3f2ad5300f05f57f3148285c4bdfa6c1434"} Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.951650 4672 generic.go:334] "Generic (PLEG): container finished" podID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" containerID="355299d41e36c0ce25713d568e2eb9e70b7bb8fc2fa3ef06bb50972c22a8754d" exitCode=0 Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.951771 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whpj" event={"ID":"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7","Type":"ContainerDied","Data":"355299d41e36c0ce25713d568e2eb9e70b7bb8fc2fa3ef06bb50972c22a8754d"} Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.951794 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whpj" event={"ID":"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7","Type":"ContainerStarted","Data":"f33cba891b49eada65b568e729c3f43e958e8f28df675ba553bcbad5d67775ee"} Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.960706 4672 generic.go:334] "Generic (PLEG): container finished" podID="e6f78753-7aad-4178-bff5-d45475f4a3df" containerID="631f1fb7afcbcf9bf5daac09113f3f8830eddd8cf814beb34d939efb49e9b715" exitCode=0 Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.960773 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x867j" event={"ID":"e6f78753-7aad-4178-bff5-d45475f4a3df","Type":"ContainerDied","Data":"631f1fb7afcbcf9bf5daac09113f3f8830eddd8cf814beb34d939efb49e9b715"} Dec 06 09:10:25 crc kubenswrapper[4672]: I1206 09:10:25.972172 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4mbvq" podStartSLOduration=3.472900314 podStartE2EDuration="4.972147185s" podCreationTimestamp="2025-12-06 09:10:21 +0000 UTC" firstStartedPulling="2025-12-06 09:10:23.84645709 +0000 UTC m=+241.590717377" lastFinishedPulling="2025-12-06 09:10:25.345703961 +0000 UTC m=+243.089964248" observedRunningTime="2025-12-06 09:10:25.961438358 +0000 UTC m=+243.705698645" watchObservedRunningTime="2025-12-06 09:10:25.972147185 +0000 UTC m=+243.716407482" Dec 06 09:10:26 crc kubenswrapper[4672]: I1206 09:10:26.970562 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-x867j" event={"ID":"e6f78753-7aad-4178-bff5-d45475f4a3df","Type":"ContainerStarted","Data":"8cdfda0a964643da502df451b338373e4b519ea8eb2fcf978e24b8c190c57c73"} Dec 06 09:10:26 crc kubenswrapper[4672]: I1206 09:10:26.973898 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrkkx" event={"ID":"7c188e7d-d705-41ce-bf0d-468de7745723","Type":"ContainerStarted","Data":"b35c9ecc58ee69c76b63bb672d7d0e66a9a41513eb5419e26a9550bcc06cb8db"} Dec 06 09:10:26 crc kubenswrapper[4672]: I1206 09:10:26.975894 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whpj" event={"ID":"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7","Type":"ContainerStarted","Data":"107e83a0b080f00525fb2da2aefaed4ad83c531a75e83fd21c6be80b281a8b15"} Dec 06 09:10:27 crc kubenswrapper[4672]: I1206 09:10:27.003560 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x867j" podStartSLOduration=2.479841561 podStartE2EDuration="4.003527517s" podCreationTimestamp="2025-12-06 09:10:23 +0000 UTC" firstStartedPulling="2025-12-06 09:10:24.901392812 +0000 UTC m=+242.645653099" lastFinishedPulling="2025-12-06 09:10:26.425078768 +0000 UTC m=+244.169339055" observedRunningTime="2025-12-06 09:10:26.998459546 +0000 UTC m=+244.742719853" watchObservedRunningTime="2025-12-06 09:10:27.003527517 +0000 UTC m=+244.747787804" Dec 06 09:10:27 crc kubenswrapper[4672]: I1206 09:10:27.048760 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rrkkx" podStartSLOduration=3.573707529 podStartE2EDuration="6.04873358s" podCreationTimestamp="2025-12-06 09:10:21 +0000 UTC" firstStartedPulling="2025-12-06 09:10:23.842206429 +0000 UTC m=+241.586466716" lastFinishedPulling="2025-12-06 09:10:26.31723248 +0000 UTC m=+244.061492767" observedRunningTime="2025-12-06 09:10:27.046333178 +0000 UTC m=+244.790593465" watchObservedRunningTime="2025-12-06 09:10:27.04873358 +0000 UTC m=+244.792993867" Dec 06 09:10:27 crc kubenswrapper[4672]: I1206 09:10:27.984661 4672 generic.go:334] "Generic (PLEG): container finished" podID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" containerID="107e83a0b080f00525fb2da2aefaed4ad83c531a75e83fd21c6be80b281a8b15" exitCode=0 Dec 06 09:10:27 crc kubenswrapper[4672]: I1206 09:10:27.984952 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whpj" event={"ID":"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7","Type":"ContainerDied","Data":"107e83a0b080f00525fb2da2aefaed4ad83c531a75e83fd21c6be80b281a8b15"} Dec 06 09:10:28 crc kubenswrapper[4672]: I1206 09:10:28.992929 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whpj" event={"ID":"e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7","Type":"ContainerStarted","Data":"23a262d313ca636a804bddb25f5f2457919786f3c5a63dd8e8c5f79ebe895e3b"} Dec 06 09:10:29 crc kubenswrapper[4672]: I1206 09:10:29.017894 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8whpj" podStartSLOduration=2.614744409 podStartE2EDuration="5.017870024s" podCreationTimestamp="2025-12-06 09:10:24 +0000 UTC" firstStartedPulling="2025-12-06 09:10:25.953165013 +0000 UTC m=+243.697425300" lastFinishedPulling="2025-12-06 09:10:28.356290628 +0000 UTC m=+246.100550915" observedRunningTime="2025-12-06 09:10:29.014192978 +0000 UTC m=+246.758453285" 
watchObservedRunningTime="2025-12-06 09:10:29.017870024 +0000 UTC m=+246.762130321" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.203217 4672 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.204427 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206697 4672 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206732 4672 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.206852 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206870 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.206883 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206888 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.206898 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206906 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.206917 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206922 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.206933 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206939 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.206948 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.206954 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207043 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207053 4672 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207249 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207259 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207267 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207301 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c" gracePeriod=15 Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.207363 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207372 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207469 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207505 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e" gracePeriod=15 Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207551 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7" gracePeriod=15 Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207589 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3" gracePeriod=15 Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.207651 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3" gracePeriod=15 Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.289880 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290449 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290503 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290636 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290702 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290725 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290744 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.290798 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392541 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392565 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392631 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392668 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392687 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392704 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392733 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392733 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392829 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392860 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392763 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392862 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392882 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392836 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.392795 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.670643 4672 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: I1206 09:10:30.671308 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:30 crc kubenswrapper[4672]: W1206 09:10:30.696390 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-bc6cb15d133638bdb3c8370eee25f48e4ccba546d50cba59ba9f3d9a8f49cf1b WatchSource:0}: Error finding container bc6cb15d133638bdb3c8370eee25f48e4ccba546d50cba59ba9f3d9a8f49cf1b: Status 404 returned error can't find the container with id bc6cb15d133638bdb3c8370eee25f48e4ccba546d50cba59ba9f3d9a8f49cf1b Dec 06 09:10:30 crc kubenswrapper[4672]: E1206 09:10:30.700092 4672 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e9545bb0e3a16 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 09:10:30.699366934 +0000 UTC m=+248.443627221,LastTimestamp:2025-12-06 09:10:30.699366934 +0000 UTC m=+248.443627221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.015324 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.017473 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.018503 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e" exitCode=0 Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.018551 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7" exitCode=0 Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.018567 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3" exitCode=0 Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.018581 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3" exitCode=2 Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.018713 4672 scope.go:117] "RemoveContainer" containerID="d2d287352c8a2f994db9d91fe81a584d5a863440f220b549cb9716d04acedda2" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 
09:10:31.021268 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7ea66acfb8f59389be68dfeb81052b2610d669316af31fbdd0fbdbd9a9884be3"} Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.021327 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bc6cb15d133638bdb3c8370eee25f48e4ccba546d50cba59ba9f3d9a8f49cf1b"} Dec 06 09:10:31 crc kubenswrapper[4672]: E1206 09:10:31.022965 4672 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.024923 4672 generic.go:334] "Generic (PLEG): container finished" podID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" containerID="3a0ad79623a09617def88fec1e134c89610428cd08342f84c8e98092b21b5618" exitCode=0 Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.024984 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9","Type":"ContainerDied","Data":"3a0ad79623a09617def88fec1e134c89610428cd08342f84c8e98092b21b5618"} Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.025950 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:31 crc kubenswrapper[4672]: E1206 09:10:31.630066 4672 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e9545bb0e3a16 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 09:10:30.699366934 +0000 UTC m=+248.443627221,LastTimestamp:2025-12-06 09:10:30.699366934 +0000 UTC m=+248.443627221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.727828 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.727886 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4mbvq" Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 
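
From 09:10:30 onward every call to api-int.crc.testing:6443 is refused: the old kube-apiserver has been killed and its replacement is not serving yet. The kubelet keeps running its file-source pods regardless and simply retries the API-dependent pieces: mirror-pod creation, event posting ("may retry after sleeping"), and the status_manager GETs that recur below. A generic sketch of that post-with-retry shape, using plain net/http against the endpoint taken from the log rather than kubelet's client-go code:

package main

import (
	"bytes"
	"fmt"
	"net/http"
	"time"
)

// postWithRetry mimics the "may retry after sleeping" behavior: on a
// connection error it sleeps with a doubling backoff and tries again,
// up to maxTries attempts.
func postWithRetry(url string, body []byte, maxTries int) error {
	backoff := time.Second
	var err error
	for i := 0; i < maxTries; i++ {
		var resp *http.Response
		resp, err = http.Post(url, "application/json", bytes.NewReader(body))
		if err == nil {
			resp.Body.Close()
			return nil
		}
		fmt.Println("post failed, sleeping:", err) // e.g. "connect: connection refused"
		time.Sleep(backoff)
		backoff *= 2
	}
	return err
}

func main() {
	// With the apiserver down this runs through its retries and
	// reports the final error, just as the event writer does above.
	err := postWithRetry("https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events", []byte(`{}`), 3)
	fmt.Println("giving up:", err)
}
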
Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.791933 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:31 crc kubenswrapper[4672]: I1206 09:10:31.792310 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.032221 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.098618 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4mbvq"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.099514 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.100135 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.338930 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rrkkx"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.340296 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rrkkx"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.426925 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rrkkx"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.427505 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.427806 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.428000 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.560983 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.561817 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.562238 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.608718 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.609577 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.610043 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.610581 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.728974 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-var-lock\") pod \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") "
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.729268 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kubelet-dir\") pod \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") "
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.729322 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kube-api-access\") pod \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\" (UID: \"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9\") "
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.730773 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-var-lock" (OuterVolumeSpecName: "var-lock") pod "a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" (UID: "a99b5768-a729-41cc-9cfb-9c6ed85c9fc9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.730790 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" (UID: "a99b5768-a729-41cc-9cfb-9c6ed85c9fc9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.746976 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" (UID: "a99b5768-a729-41cc-9cfb-9c6ed85c9fc9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.831295 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.831637 4672 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-var-lock\") on node \"crc\" DevicePath \"\""
Dec 06 09:10:32 crc kubenswrapper[4672]: I1206 09:10:32.831716 4672 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a99b5768-a729-41cc-9cfb-9c6ed85c9fc9-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.042071 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.042753 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c" exitCode=0
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.044494 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a99b5768-a729-41cc-9cfb-9c6ed85c9fc9","Type":"ContainerDied","Data":"5758bcbb33ca9ebbc408d7134193aed1b653558b2cff38e07ec1e595109b83c0"}
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.044536 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5758bcbb33ca9ebbc408d7134193aed1b653558b2cff38e07ec1e595109b83c0"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.044673 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.061284 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.061933 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.062229 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.090014 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rrkkx"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.090570 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.090964 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.091646 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.217785 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.219525 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.220315 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.220721 4672 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.221232 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.221507 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237130 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237240 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237266 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237297 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237331 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237395 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237628 4672 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237648 4672 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:33 crc kubenswrapper[4672]: I1206 09:10:33.237665 4672 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.053961 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.055981 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.059676 4672 scope.go:117] "RemoveContainer" containerID="b83e1916d6b882fc1c9a9bb8e518b1f721fb3a0bae23c702e91a6f8e479e597e" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.077730 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.078115 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.078755 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.078942 4672 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.083331 4672 scope.go:117] "RemoveContainer" 
containerID="39b234b8f47392b0807ddc9e56a62151e74280c6fa983c1ecd4b6031e0a87fe7" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.100778 4672 scope.go:117] "RemoveContainer" containerID="36d7b8d31695f21c6aefa50b35c4d6cad2fd9d36982bd35ae8e2aa4e0a0962b3" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.115159 4672 scope.go:117] "RemoveContainer" containerID="8ca99b7154a1affae949e4e88bde986fe820886066d1e568640410a02b8ea7f3" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.130769 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.131845 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.131943 4672 scope.go:117] "RemoveContainer" containerID="7a8b63fb20db5c2d329d00b3942bde17f7bc389d6f24208d0783fd8466d1c86c" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.152753 4672 scope.go:117] "RemoveContainer" containerID="1a27d74dff39c28634fb10706f35fb472e352df187d08d68da91515c48fc35e4" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.189284 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.189890 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.190134 4672 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.190334 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.190550 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.191107 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.565677 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" 
path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.740435 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.740917 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.788399 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.788939 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.789275 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.789745 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.790018 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:34 crc kubenswrapper[4672]: I1206 09:10:34.790396 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.098741 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x867j" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.099248 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.099455 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: 
connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.099655 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.099916 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.100121 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.101100 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8whpj" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.101317 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.101500 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.101712 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.101930 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:35 crc kubenswrapper[4672]: I1206 09:10:35.102147 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.383643 4672 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.385461 4672 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.386087 4672 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.386577 4672 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.386954 4672 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:38 crc kubenswrapper[4672]: I1206 09:10:38.386981 4672 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.387394 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.588953 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Dec 06 09:10:38 crc kubenswrapper[4672]: E1206 09:10:38.990652 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms" Dec 06 09:10:39 crc kubenswrapper[4672]: E1206 09:10:39.791979 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s" Dec 06 09:10:40 crc kubenswrapper[4672]: E1206 09:10:40.630129 4672 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" volumeName="registry-storage" Dec 06 09:10:41 crc kubenswrapper[4672]: E1206 09:10:41.393622 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s" Dec 06 09:10:41 crc kubenswrapper[4672]: E1206 09:10:41.631768 4672 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e9545bb0e3a16 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 09:10:30.699366934 +0000 UTC m=+248.443627221,LastTimestamp:2025-12-06 09:10:30.699366934 +0000 UTC m=+248.443627221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 09:10:41 crc kubenswrapper[4672]: I1206 09:10:41.831207 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" containerName="oauth-openshift" containerID="cri-o://c8da844de12dcbf66a5630dbb05a91305e0eced78e6cafaf7ed87af5be982b66" gracePeriod=15 Dec 06 09:10:42 crc kubenswrapper[4672]: I1206 09:10:42.559265 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:42 crc kubenswrapper[4672]: I1206 09:10:42.560456 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:42 crc kubenswrapper[4672]: I1206 09:10:42.560823 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:42 crc kubenswrapper[4672]: I1206 09:10:42.561060 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:42 crc kubenswrapper[4672]: I1206 09:10:42.561275 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" 
pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.119246 4672 generic.go:334] "Generic (PLEG): container finished" podID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" containerID="c8da844de12dcbf66a5630dbb05a91305e0eced78e6cafaf7ed87af5be982b66" exitCode=0 Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.119313 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" event={"ID":"d543231a-ae36-4b66-ac6a-fc3b48a0acb3","Type":"ContainerDied","Data":"c8da844de12dcbf66a5630dbb05a91305e0eced78e6cafaf7ed87af5be982b66"} Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.557854 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.559702 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.559944 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.560195 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.560411 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.560769 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.580861 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.580901 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:44 crc kubenswrapper[4672]: E1206 09:10:44.581480 4672 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.582160 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:44 crc kubenswrapper[4672]: E1206 09:10:44.595070 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="6.4s" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.986590 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.988106 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.988568 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.988941 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.989204 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.989447 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:44 crc kubenswrapper[4672]: I1206 09:10:44.989683 4672 status_manager.go:851] "Failed to get status for pod" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lm8cx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009287 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-cliconfig\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009368 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwzdj\" (UniqueName: \"kubernetes.io/projected/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-kube-api-access-fwzdj\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009392 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-router-certs\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009432 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-policies\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009463 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-service-ca\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009483 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-session\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009578 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-provider-selection\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009624 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-error\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009648 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-idp-0-file-data\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009671 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-trusted-ca-bundle\") pod 
\"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009694 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-login\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009716 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-serving-cert\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009741 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-dir\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.009787 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-ocp-branding-template\") pod \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\" (UID: \"d543231a-ae36-4b66-ac6a-fc3b48a0acb3\") " Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.010445 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.010559 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.010810 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.011576 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.014887 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.016587 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.016659 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.016859 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-kube-api-access-fwzdj" (OuterVolumeSpecName: "kube-api-access-fwzdj") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "kube-api-access-fwzdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.017083 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.017192 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.017625 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.017683 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.017956 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.018749 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d543231a-ae36-4b66-ac6a-fc3b48a0acb3" (UID: "d543231a-ae36-4b66-ac6a-fc3b48a0acb3"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111113 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111162 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111175 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwzdj\" (UniqueName: \"kubernetes.io/projected/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-kube-api-access-fwzdj\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111185 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111200 4672 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111211 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111224 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111235 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111245 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111254 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111263 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111272 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111281 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.111290 4672 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d543231a-ae36-4b66-ac6a-fc3b48a0acb3-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.126489 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.126722 4672 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92" exitCode=1 Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.126845 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92"} Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.127738 4672 scope.go:117] "RemoveContainer" containerID="759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.128182 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.128711 4672 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fa6a95d8c7886ffa272d451ccdda8c462dac8018d9b28171f19d808e8cc19213" exitCode=0 Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.128772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fa6a95d8c7886ffa272d451ccdda8c462dac8018d9b28171f19d808e8cc19213"} Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.128798 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a3f28ca8b4996b2cf72c97a98e55cd861a6182d809f460c9aae976bfa58e1e48"} Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.129205 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.129222 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.129480 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: E1206 09:10:45.129621 4672 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.130734 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.131580 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" event={"ID":"d543231a-ae36-4b66-ac6a-fc3b48a0acb3","Type":"ContainerDied","Data":"075673aeebdbdc41ac66ce8a8366f6fe6ea1bde1b3e0a17fc12d625948d2e747"} Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.131748 4672 scope.go:117] "RemoveContainer" containerID="c8da844de12dcbf66a5630dbb05a91305e0eced78e6cafaf7ed87af5be982b66" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.131693 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.146459 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.146909 4672 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.147144 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.147417 4672 status_manager.go:851] "Failed to get status for pod" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lm8cx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.148098 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.148307 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.148696 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.148905 4672 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.150144 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.157087 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.157908 4672 status_manager.go:851] "Failed to get status for pod" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lm8cx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.160544 4672 status_manager.go:851] "Failed to get status for pod" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" pod="openshift-authentication/oauth-openshift-558db77b4-lm8cx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lm8cx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.161083 4672 status_manager.go:851] "Failed to get status for pod" podUID="e6f78753-7aad-4178-bff5-d45475f4a3df" pod="openshift-marketplace/community-operators-x867j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x867j\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.161294 4672 status_manager.go:851] "Failed to get status for pod" podUID="55620a10-8ac9-47b4-88b9-7129c90c4ee4" pod="openshift-marketplace/redhat-marketplace-4mbvq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-4mbvq\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.161710 4672 status_manager.go:851] "Failed to get status for pod" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.161985 4672 status_manager.go:851] "Failed to get status for pod" podUID="e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7" pod="openshift-marketplace/redhat-operators-8whpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8whpj\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.162142 4672 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:45 crc kubenswrapper[4672]: I1206 09:10:45.162299 4672 status_manager.go:851] "Failed to get status for pod" podUID="7c188e7d-d705-41ce-bf0d-468de7745723" 
pod="openshift-marketplace/certified-operators-rrkkx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rrkkx\": dial tcp 38.102.83.30:6443: connect: connection refused" Dec 06 09:10:46 crc kubenswrapper[4672]: I1206 09:10:46.357372 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:10:46 crc kubenswrapper[4672]: I1206 09:10:46.368128 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 09:10:46 crc kubenswrapper[4672]: I1206 09:10:46.368397 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0fd651e61a33daf174baa0089bba57a0d20881f888f7f54b9920cc2a61857ef"} Dec 06 09:10:46 crc kubenswrapper[4672]: I1206 09:10:46.375489 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a21251dc76aa9bac534914f7a499013731a80804469f47aa1f831f86e99c780b"} Dec 06 09:10:46 crc kubenswrapper[4672]: I1206 09:10:46.375556 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3319688a4987910172ea480d5019bd1d1ca5ee70394d60ac554b3acdf67b035a"} Dec 06 09:10:47 crc kubenswrapper[4672]: I1206 09:10:47.388836 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"383a01db16987f26b5ff9a27ba0f455d71ce8a887189b8509e66a3791082f049"} Dec 06 09:10:47 crc kubenswrapper[4672]: I1206 09:10:47.388911 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8cd0904c395688afe63091120f715cae1f1b3743247211aa641be1d8df5b5f93"} Dec 06 09:10:47 crc kubenswrapper[4672]: I1206 09:10:47.388926 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d218932e042a82e29eaef6b6f752de960a0c1e8c0ada3d1a663fb1bbfcb031de"} Dec 06 09:10:47 crc kubenswrapper[4672]: I1206 09:10:47.389358 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:47 crc kubenswrapper[4672]: I1206 09:10:47.389398 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:49 crc kubenswrapper[4672]: I1206 09:10:49.583275 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:49 crc kubenswrapper[4672]: I1206 09:10:49.585894 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:49 crc kubenswrapper[4672]: I1206 09:10:49.588867 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:52 crc kubenswrapper[4672]: I1206 09:10:52.534374 4672 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:52 crc kubenswrapper[4672]: I1206 09:10:52.776588 4672 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2077b55b-8928-4177-8c93-c0703ebc6826" Dec 06 09:10:53 crc kubenswrapper[4672]: I1206 09:10:53.429979 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:53 crc kubenswrapper[4672]: I1206 09:10:53.430108 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:53 crc kubenswrapper[4672]: I1206 09:10:53.430148 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:53 crc kubenswrapper[4672]: I1206 09:10:53.435806 4672 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2077b55b-8928-4177-8c93-c0703ebc6826" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.436347 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.436393 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.441659 4672 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2077b55b-8928-4177-8c93-c0703ebc6826" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.443125 4672 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://3319688a4987910172ea480d5019bd1d1ca5ee70394d60ac554b3acdf67b035a" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.443291 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.475124 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.475522 4672 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 06 09:10:54 crc kubenswrapper[4672]: I1206 09:10:54.475669 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: 
connection refused" Dec 06 09:10:55 crc kubenswrapper[4672]: I1206 09:10:55.441275 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:55 crc kubenswrapper[4672]: I1206 09:10:55.441309 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:10:55 crc kubenswrapper[4672]: I1206 09:10:55.445722 4672 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2077b55b-8928-4177-8c93-c0703ebc6826" Dec 06 09:10:55 crc kubenswrapper[4672]: I1206 09:10:55.842045 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:11:01 crc kubenswrapper[4672]: I1206 09:11:01.954384 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 09:11:02 crc kubenswrapper[4672]: I1206 09:11:02.396872 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 09:11:02 crc kubenswrapper[4672]: I1206 09:11:02.513706 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 09:11:02 crc kubenswrapper[4672]: I1206 09:11:02.905701 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 09:11:03 crc kubenswrapper[4672]: I1206 09:11:03.203555 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 09:11:03 crc kubenswrapper[4672]: I1206 09:11:03.815650 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 09:11:03 crc kubenswrapper[4672]: I1206 09:11:03.844585 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 09:11:03 crc kubenswrapper[4672]: I1206 09:11:03.990532 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 09:11:04 crc kubenswrapper[4672]: I1206 09:11:04.460699 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 09:11:04 crc kubenswrapper[4672]: I1206 09:11:04.474565 4672 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 06 09:11:04 crc kubenswrapper[4672]: I1206 09:11:04.474650 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.021066 4672 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.133295 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.226359 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.326846 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.363336 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.372461 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.410404 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.569581 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.634764 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.652223 4672 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.759655 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 09:11:05 crc kubenswrapper[4672]: I1206 09:11:05.985550 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.098475 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.130804 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.221559 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.304953 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.371423 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.502751 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.543992 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.544454 4672 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.550477 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.583854 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.644052 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.645519 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.705505 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.720275 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.754402 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.788763 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.821326 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.823479 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.864027 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.883015 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 09:11:06 crc kubenswrapper[4672]: I1206 09:11:06.995296 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.009226 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.030899 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.106263 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.128297 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.159156 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.235061 4672 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.282099 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.310682 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.339894 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.558787 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.657108 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.686317 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.689102 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.722336 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 09:11:07 crc kubenswrapper[4672]: I1206 09:11:07.789481 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.058920 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.136219 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.153403 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.156522 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.292219 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.343246 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.407668 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.558487 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.572649 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.618546 4672 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.633001 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.762496 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.774104 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.837712 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 09:11:08 crc kubenswrapper[4672]: I1206 09:11:08.960012 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.169748 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.271827 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.312815 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.344276 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.410786 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.412320 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.440179 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.448553 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.453006 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.475517 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.569373 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.614309 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.737048 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.771084 4672 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.860878 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.954372 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 09:11:09 crc kubenswrapper[4672]: I1206 09:11:09.959580 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.028566 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.131512 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.147187 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.204225 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.278120 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.282100 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.304449 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.368775 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.379688 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.449192 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.474796 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.642724 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.710373 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.734143 4672 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.806792 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.855722 4672 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.857059 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.861869 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.876140 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.890589 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.901976 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 09:11:10 crc kubenswrapper[4672]: I1206 09:11:10.994792 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.056122 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.105738 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.110145 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.191183 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.207028 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.509505 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.591869 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.611766 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.615012 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.732269 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.774845 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.794625 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" 
Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.805914 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.816284 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.849476 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.897072 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 09:11:11 crc kubenswrapper[4672]: I1206 09:11:11.939373 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.019065 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.115381 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.228943 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.310047 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.341344 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.378514 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.423392 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.694456 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.724445 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.750900 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.823847 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.869050 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 09:11:12 crc kubenswrapper[4672]: I1206 09:11:12.971417 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.100667 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 09:11:13 
crc kubenswrapper[4672]: I1206 09:11:13.115033 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.269982 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.276863 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.330790 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.427020 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.461838 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.510434 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.614575 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.708680 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.722900 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.730553 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.843872 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.854317 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.907687 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.958525 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.968278 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 09:11:13 crc kubenswrapper[4672]: I1206 09:11:13.974266 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.224077 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.274925 4672 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.312358 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.363379 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.384111 4672 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.384488 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.395284 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.395332 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.434850 4672 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.474641 4672 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.474736 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.474817 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.475878 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b0fd651e61a33daf174baa0089bba57a0d20881f888f7f54b9920cc2a61857ef"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.476033 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b0fd651e61a33daf174baa0089bba57a0d20881f888f7f54b9920cc2a61857ef" gracePeriod=30 Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.501514 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.523463 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 
09:11:14.545730 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.659084 4672 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.659624 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.712885 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.742905 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.766974 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.837976 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.865509 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.866701 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.921455 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 09:11:14 crc kubenswrapper[4672]: I1206 09:11:14.930109 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.094464 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.108551 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.138210 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.241383 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.296434 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.391386 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.425578 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.439639 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 
09:11:15.480112 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.532780 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.583094 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.607954 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.620655 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.654983 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.858250 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.971782 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 09:11:15 crc kubenswrapper[4672]: I1206 09:11:15.989397 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.128544 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.166793 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.221069 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.346724 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.360715 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.411360 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.440513 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.666847 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.673219 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.762229 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.801092 4672 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.892055 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.892298 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.904374 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.952784 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 09:11:16 crc kubenswrapper[4672]: I1206 09:11:16.990618 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.106841 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.136570 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.137815 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.193840 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.194101 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.233637 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.500731 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.557250 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.610128 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.612542 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.621298 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.776128 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.836943 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 09:11:17 crc kubenswrapper[4672]: I1206 09:11:17.954477 4672 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.041702 4672 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.060731 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lm8cx","openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.060812 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-54b5c98c4-hrns7","openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 09:11:18 crc kubenswrapper[4672]: E1206 09:11:18.061821 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" containerName="installer" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.062052 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" containerName="installer" Dec 06 09:11:18 crc kubenswrapper[4672]: E1206 09:11:18.062264 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" containerName="oauth-openshift" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.062453 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" containerName="oauth-openshift" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.062521 4672 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.062833 4672 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3505d55c-174e-4512-98f0-983267f3e3ea" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.063303 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" containerName="oauth-openshift" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.063565 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99b5768-a729-41cc-9cfb-9c6ed85c9fc9" containerName="installer" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.064514 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.069090 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.069632 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.070199 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.071431 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.072511 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.072659 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.073083 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.074080 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.074801 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.079906 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.080108 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.080850 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.098272 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.098507 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.103147 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.103126993 podStartE2EDuration="26.103126993s" podCreationTimestamp="2025-12-06 09:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:11:18.099552165 +0000 UTC m=+295.843812492" watchObservedRunningTime="2025-12-06 09:11:18.103126993 +0000 UTC m=+295.847387310" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.106432 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 09:11:18 crc 
kubenswrapper[4672]: I1206 09:11:18.121323 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121371 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-session\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121508 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f57d9f38-7332-44c2-b8d1-148a03163ada-audit-dir\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121560 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h946\" (UniqueName: \"kubernetes.io/projected/f57d9f38-7332-44c2-b8d1-148a03163ada-kube-api-access-2h946\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121721 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121821 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121873 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121937 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: 
\"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.121981 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.122016 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-audit-policies\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.122066 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.122103 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.122146 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.122171 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.224993 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225061 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225095 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225121 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-session\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225151 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f57d9f38-7332-44c2-b8d1-148a03163ada-audit-dir\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225172 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h946\" (UniqueName: \"kubernetes.io/projected/f57d9f38-7332-44c2-b8d1-148a03163ada-kube-api-access-2h946\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225227 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225272 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225302 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225336 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225363 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225388 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-audit-policies\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225412 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.225441 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.227396 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f57d9f38-7332-44c2-b8d1-148a03163ada-audit-dir\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.229181 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.229437 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-audit-policies\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.229575 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: 
\"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.229778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.232654 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.235280 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.236840 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-session\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.237242 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.238021 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.238661 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.239640 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " 
pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.240173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f57d9f38-7332-44c2-b8d1-148a03163ada-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.247697 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h946\" (UniqueName: \"kubernetes.io/projected/f57d9f38-7332-44c2-b8d1-148a03163ada-kube-api-access-2h946\") pod \"oauth-openshift-54b5c98c4-hrns7\" (UID: \"f57d9f38-7332-44c2-b8d1-148a03163ada\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.285975 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.329548 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.397650 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.403580 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.494495 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.508840 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.568039 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d543231a-ae36-4b66-ac6a-fc3b48a0acb3" path="/var/lib/kubelet/pods/d543231a-ae36-4b66-ac6a-fc3b48a0acb3/volumes" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.577486 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 09:11:18 crc kubenswrapper[4672]: I1206 09:11:18.689688 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54b5c98c4-hrns7"] Dec 06 09:11:19 crc kubenswrapper[4672]: I1206 09:11:19.582246 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" event={"ID":"f57d9f38-7332-44c2-b8d1-148a03163ada","Type":"ContainerStarted","Data":"5e4e3f4535055e22c25fc5d3a5f851c0a59694b45c0cf2f9136218c4b088a153"} Dec 06 09:11:19 crc kubenswrapper[4672]: I1206 09:11:19.582980 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:19 crc kubenswrapper[4672]: I1206 09:11:19.583008 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" event={"ID":"f57d9f38-7332-44c2-b8d1-148a03163ada","Type":"ContainerStarted","Data":"2e096b7c044c0a43efd8068361b29fae98b148e6358837ef48f628d9bb4a20c1"} Dec 06 09:11:19 crc 
kubenswrapper[4672]: I1206 09:11:19.590262 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" Dec 06 09:11:19 crc kubenswrapper[4672]: I1206 09:11:19.609916 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54b5c98c4-hrns7" podStartSLOduration=63.609897904 podStartE2EDuration="1m3.609897904s" podCreationTimestamp="2025-12-06 09:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:11:19.609257367 +0000 UTC m=+297.353517654" watchObservedRunningTime="2025-12-06 09:11:19.609897904 +0000 UTC m=+297.354158191" Dec 06 09:11:19 crc kubenswrapper[4672]: I1206 09:11:19.797225 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 09:11:25 crc kubenswrapper[4672]: I1206 09:11:25.372308 4672 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 09:11:25 crc kubenswrapper[4672]: I1206 09:11:25.372971 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7ea66acfb8f59389be68dfeb81052b2610d669316af31fbdd0fbdbd9a9884be3" gracePeriod=5 Dec 06 09:11:30 crc kubenswrapper[4672]: I1206 09:11:30.647816 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 09:11:30 crc kubenswrapper[4672]: I1206 09:11:30.648187 4672 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7ea66acfb8f59389be68dfeb81052b2610d669316af31fbdd0fbdbd9a9884be3" exitCode=137 Dec 06 09:11:30 crc kubenswrapper[4672]: I1206 09:11:30.974435 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 09:11:30 crc kubenswrapper[4672]: I1206 09:11:30.974515 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.141951 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142034 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142138 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142175 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142223 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142649 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142714 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142737 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.142781 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.143077 4672 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.143094 4672 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.143105 4672 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.143115 4672 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.153788 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.244178 4672 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.657387 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.657913 4672 scope.go:117] "RemoveContainer" containerID="7ea66acfb8f59389be68dfeb81052b2610d669316af31fbdd0fbdbd9a9884be3" Dec 06 09:11:31 crc kubenswrapper[4672]: I1206 09:11:31.657980 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 09:11:32 crc kubenswrapper[4672]: I1206 09:11:32.564891 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 06 09:11:41 crc kubenswrapper[4672]: I1206 09:11:41.389029 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 09:11:44 crc kubenswrapper[4672]: I1206 09:11:44.731656 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 06 09:11:44 crc kubenswrapper[4672]: I1206 09:11:44.733410 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 09:11:44 crc kubenswrapper[4672]: I1206 09:11:44.733458 4672 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b0fd651e61a33daf174baa0089bba57a0d20881f888f7f54b9920cc2a61857ef" exitCode=137 Dec 06 09:11:44 crc kubenswrapper[4672]: I1206 09:11:44.733492 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b0fd651e61a33daf174baa0089bba57a0d20881f888f7f54b9920cc2a61857ef"} Dec 06 09:11:44 crc kubenswrapper[4672]: I1206 09:11:44.733530 4672 scope.go:117] "RemoveContainer" containerID="759b6d3d1f936d78b4bd9c2c945adc49e9093cbcf65c1dcf67a254f42b839a92" Dec 06 09:11:45 crc kubenswrapper[4672]: I1206 09:11:45.741320 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 06 09:11:45 crc kubenswrapper[4672]: I1206 09:11:45.742890 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f733329363c19286c80bdbd4d89f12577bee5953b0b70ad63f7905637c8c489"} Dec 06 09:11:45 crc kubenswrapper[4672]: I1206 09:11:45.842562 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:11:50 crc kubenswrapper[4672]: I1206 09:11:50.188959 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 09:11:54 crc kubenswrapper[4672]: I1206 09:11:54.474392 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:11:54 crc kubenswrapper[4672]: I1206 09:11:54.480146 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:11:55 crc kubenswrapper[4672]: I1206 09:11:55.848138 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.097157 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-r2nq8"] Dec 06 09:12:22 crc kubenswrapper[4672]: E1206 09:12:22.099032 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.099102 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.099256 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.099760 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.178479 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r2nq8"] Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203347 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ptq\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-kube-api-access-v8ptq\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203400 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72277381-496a-4f22-94fc-a341a765c603-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203429 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203450 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72277381-496a-4f22-94fc-a341a765c603-trusted-ca\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203477 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-bound-sa-token\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203524 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72277381-496a-4f22-94fc-a341a765c603-registry-certificates\") pod 
\"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203553 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-registry-tls\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.203570 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72277381-496a-4f22-94fc-a341a765c603-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.239948 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305070 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72277381-496a-4f22-94fc-a341a765c603-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305136 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ptq\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-kube-api-access-v8ptq\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305161 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72277381-496a-4f22-94fc-a341a765c603-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305192 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72277381-496a-4f22-94fc-a341a765c603-trusted-ca\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305223 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-bound-sa-token\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 
crc kubenswrapper[4672]: I1206 09:12:22.305270 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72277381-496a-4f22-94fc-a341a765c603-registry-certificates\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305309 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-registry-tls\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.305782 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72277381-496a-4f22-94fc-a341a765c603-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.306848 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72277381-496a-4f22-94fc-a341a765c603-registry-certificates\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.307430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72277381-496a-4f22-94fc-a341a765c603-trusted-ca\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.317296 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72277381-496a-4f22-94fc-a341a765c603-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.322255 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-registry-tls\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.322464 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ptq\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-kube-api-access-v8ptq\") pod \"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.326082 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72277381-496a-4f22-94fc-a341a765c603-bound-sa-token\") pod 
\"image-registry-66df7c8f76-r2nq8\" (UID: \"72277381-496a-4f22-94fc-a341a765c603\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.422075 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.607534 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r2nq8"] Dec 06 09:12:22 crc kubenswrapper[4672]: I1206 09:12:22.976981 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" event={"ID":"72277381-496a-4f22-94fc-a341a765c603","Type":"ContainerStarted","Data":"e21c656f83d9fe6bb6f2aa0e78fa019170d0fbab6f4d01a8dc108ccdef696194"} Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.852628 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zvxtd"] Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.853258 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" podUID="05e5af51-76dc-4825-bab8-a5048aea49e9" containerName="controller-manager" containerID="cri-o://c51a4bd045d8936c25b4cdb38d689fd49fb82ef5f09a3502485ef0a3c991977c" gracePeriod=30 Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.952742 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"] Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.953005 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" podUID="211614db-3bf5-4db7-9146-cc91303fc217" containerName="route-controller-manager" containerID="cri-o://5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637" gracePeriod=30 Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.984566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" event={"ID":"72277381-496a-4f22-94fc-a341a765c603","Type":"ContainerStarted","Data":"8b1c68cf1330cd59195117a5533f730af4ed77501e2af66ae850234401754fb6"} Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.985764 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.987557 4672 generic.go:334] "Generic (PLEG): container finished" podID="05e5af51-76dc-4825-bab8-a5048aea49e9" containerID="c51a4bd045d8936c25b4cdb38d689fd49fb82ef5f09a3502485ef0a3c991977c" exitCode=0 Dec 06 09:12:23 crc kubenswrapper[4672]: I1206 09:12:23.987614 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" event={"ID":"05e5af51-76dc-4825-bab8-a5048aea49e9","Type":"ContainerDied","Data":"c51a4bd045d8936c25b4cdb38d689fd49fb82ef5f09a3502485ef0a3c991977c"} Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.235857 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.261510 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" podStartSLOduration=2.261465605 podStartE2EDuration="2.261465605s" podCreationTimestamp="2025-12-06 09:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:24.008501966 +0000 UTC m=+361.752762273" watchObservedRunningTime="2025-12-06 09:12:24.261465605 +0000 UTC m=+362.005725892" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.323455 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.345740 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-proxy-ca-bundles\") pod \"05e5af51-76dc-4825-bab8-a5048aea49e9\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.345782 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-config\") pod \"05e5af51-76dc-4825-bab8-a5048aea49e9\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.345828 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vh9z\" (UniqueName: \"kubernetes.io/projected/05e5af51-76dc-4825-bab8-a5048aea49e9-kube-api-access-4vh9z\") pod \"05e5af51-76dc-4825-bab8-a5048aea49e9\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.345888 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05e5af51-76dc-4825-bab8-a5048aea49e9-serving-cert\") pod \"05e5af51-76dc-4825-bab8-a5048aea49e9\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.345914 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-client-ca\") pod \"05e5af51-76dc-4825-bab8-a5048aea49e9\" (UID: \"05e5af51-76dc-4825-bab8-a5048aea49e9\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.347008 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "05e5af51-76dc-4825-bab8-a5048aea49e9" (UID: "05e5af51-76dc-4825-bab8-a5048aea49e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.347186 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "05e5af51-76dc-4825-bab8-a5048aea49e9" (UID: "05e5af51-76dc-4825-bab8-a5048aea49e9"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.347296 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-config" (OuterVolumeSpecName: "config") pod "05e5af51-76dc-4825-bab8-a5048aea49e9" (UID: "05e5af51-76dc-4825-bab8-a5048aea49e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.361036 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e5af51-76dc-4825-bab8-a5048aea49e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "05e5af51-76dc-4825-bab8-a5048aea49e9" (UID: "05e5af51-76dc-4825-bab8-a5048aea49e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.361876 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e5af51-76dc-4825-bab8-a5048aea49e9-kube-api-access-4vh9z" (OuterVolumeSpecName: "kube-api-access-4vh9z") pod "05e5af51-76dc-4825-bab8-a5048aea49e9" (UID: "05e5af51-76dc-4825-bab8-a5048aea49e9"). InnerVolumeSpecName "kube-api-access-4vh9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447152 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwjp5\" (UniqueName: \"kubernetes.io/projected/211614db-3bf5-4db7-9146-cc91303fc217-kube-api-access-lwjp5\") pod \"211614db-3bf5-4db7-9146-cc91303fc217\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447199 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211614db-3bf5-4db7-9146-cc91303fc217-serving-cert\") pod \"211614db-3bf5-4db7-9146-cc91303fc217\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447243 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-client-ca\") pod \"211614db-3bf5-4db7-9146-cc91303fc217\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447266 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-config\") pod \"211614db-3bf5-4db7-9146-cc91303fc217\" (UID: \"211614db-3bf5-4db7-9146-cc91303fc217\") " Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447575 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05e5af51-76dc-4825-bab8-a5048aea49e9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447596 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447621 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" 
Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447634 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05e5af51-76dc-4825-bab8-a5048aea49e9-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.447644 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vh9z\" (UniqueName: \"kubernetes.io/projected/05e5af51-76dc-4825-bab8-a5048aea49e9-kube-api-access-4vh9z\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.448618 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-config" (OuterVolumeSpecName: "config") pod "211614db-3bf5-4db7-9146-cc91303fc217" (UID: "211614db-3bf5-4db7-9146-cc91303fc217"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.448743 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-client-ca" (OuterVolumeSpecName: "client-ca") pod "211614db-3bf5-4db7-9146-cc91303fc217" (UID: "211614db-3bf5-4db7-9146-cc91303fc217"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.451326 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211614db-3bf5-4db7-9146-cc91303fc217-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "211614db-3bf5-4db7-9146-cc91303fc217" (UID: "211614db-3bf5-4db7-9146-cc91303fc217"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.451406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211614db-3bf5-4db7-9146-cc91303fc217-kube-api-access-lwjp5" (OuterVolumeSpecName: "kube-api-access-lwjp5") pod "211614db-3bf5-4db7-9146-cc91303fc217" (UID: "211614db-3bf5-4db7-9146-cc91303fc217"). InnerVolumeSpecName "kube-api-access-lwjp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.549011 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwjp5\" (UniqueName: \"kubernetes.io/projected/211614db-3bf5-4db7-9146-cc91303fc217-kube-api-access-lwjp5\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.549051 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211614db-3bf5-4db7-9146-cc91303fc217-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.549061 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.549069 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211614db-3bf5-4db7-9146-cc91303fc217-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.997521 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.998736 4672 generic.go:334] "Generic (PLEG): container finished" podID="211614db-3bf5-4db7-9146-cc91303fc217" containerID="5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637" exitCode=0 Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.997469 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zvxtd" event={"ID":"05e5af51-76dc-4825-bab8-a5048aea49e9","Type":"ContainerDied","Data":"f6b1e85ec4948743ad314bc62ad1e35b28c3c6710cd616e3a5b5d228b329c479"} Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.998806 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.998887 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" event={"ID":"211614db-3bf5-4db7-9146-cc91303fc217","Type":"ContainerDied","Data":"5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637"} Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.998949 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv" event={"ID":"211614db-3bf5-4db7-9146-cc91303fc217","Type":"ContainerDied","Data":"1626201f60efd7891ab74d06d02ebc0c51b8a4baaf04879d9d02542e646e2d8c"} Dec 06 09:12:24 crc kubenswrapper[4672]: I1206 09:12:24.999000 4672 scope.go:117] "RemoveContainer" containerID="c51a4bd045d8936c25b4cdb38d689fd49fb82ef5f09a3502485ef0a3c991977c" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.016749 4672 scope.go:117] "RemoveContainer" containerID="5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.019826 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"] Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.027909 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5pxtv"] Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.034560 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zvxtd"] Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.040896 4672 scope.go:117] "RemoveContainer" containerID="5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.041102 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zvxtd"] Dec 06 09:12:25 crc kubenswrapper[4672]: E1206 09:12:25.041469 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637\": container with ID starting with 5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637 not found: ID does not exist" containerID="5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.041522 4672 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637"} err="failed to get container status \"5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637\": rpc error: code = NotFound desc = could not find container \"5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637\": container with ID starting with 5c6c509ca639e7fdbdd190de1e3fa0dbb48caf263a1b0d45f67faf9f39808637 not found: ID does not exist" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.085793 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74586866d5-p29bp"] Dec 06 09:12:25 crc kubenswrapper[4672]: E1206 09:12:25.086136 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e5af51-76dc-4825-bab8-a5048aea49e9" containerName="controller-manager" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.086161 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e5af51-76dc-4825-bab8-a5048aea49e9" containerName="controller-manager" Dec 06 09:12:25 crc kubenswrapper[4672]: E1206 09:12:25.086200 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211614db-3bf5-4db7-9146-cc91303fc217" containerName="route-controller-manager" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.086210 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="211614db-3bf5-4db7-9146-cc91303fc217" containerName="route-controller-manager" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.086342 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e5af51-76dc-4825-bab8-a5048aea49e9" containerName="controller-manager" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.086370 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="211614db-3bf5-4db7-9146-cc91303fc217" containerName="route-controller-manager" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.086891 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.089400 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7"] Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.090075 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.090394 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.090640 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.090859 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.090947 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.090983 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.091449 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.092118 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.092244 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.092340 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.092478 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.092751 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.093955 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.105411 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7"] Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.107055 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.109162 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74586866d5-p29bp"] Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157584 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-proxy-ca-bundles\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157667 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-client-ca\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157709 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87cf20e-9c40-4119-9868-02fd0fba981a-serving-cert\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157753 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-client-ca\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157769 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-config\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157786 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb2cz\" (UniqueName: \"kubernetes.io/projected/5832ce31-cac1-4f07-8d76-f349ce678c9e-kube-api-access-jb2cz\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btz7j\" (UniqueName: \"kubernetes.io/projected/e87cf20e-9c40-4119-9868-02fd0fba981a-kube-api-access-btz7j\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157836 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5832ce31-cac1-4f07-8d76-f349ce678c9e-serving-cert\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.157880 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-config\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.258784 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-client-ca\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.258859 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb2cz\" (UniqueName: \"kubernetes.io/projected/5832ce31-cac1-4f07-8d76-f349ce678c9e-kube-api-access-jb2cz\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.258895 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-config\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.258935 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btz7j\" (UniqueName: \"kubernetes.io/projected/e87cf20e-9c40-4119-9868-02fd0fba981a-kube-api-access-btz7j\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.258975 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5832ce31-cac1-4f07-8d76-f349ce678c9e-serving-cert\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.259023 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-config\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.259096 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-proxy-ca-bundles\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.259144 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-client-ca\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.259203 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87cf20e-9c40-4119-9868-02fd0fba981a-serving-cert\") pod 
\"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.261960 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-client-ca\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.261984 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-config\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.262769 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-config\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.263016 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-proxy-ca-bundles\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.263114 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-client-ca\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.263891 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5832ce31-cac1-4f07-8d76-f349ce678c9e-serving-cert\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.266220 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87cf20e-9c40-4119-9868-02fd0fba981a-serving-cert\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.286000 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btz7j\" (UniqueName: \"kubernetes.io/projected/e87cf20e-9c40-4119-9868-02fd0fba981a-kube-api-access-btz7j\") pod \"controller-manager-74586866d5-p29bp\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.286971 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb2cz\" (UniqueName: \"kubernetes.io/projected/5832ce31-cac1-4f07-8d76-f349ce678c9e-kube-api-access-jb2cz\") pod \"route-controller-manager-678cc6964f-wjnb7\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.411256 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.420952 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.626241 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74586866d5-p29bp"] Dec 06 09:12:25 crc kubenswrapper[4672]: W1206 09:12:25.626812 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87cf20e_9c40_4119_9868_02fd0fba981a.slice/crio-48982df86178259a939f12b95c3504ccce3ebb1863d8d1f9213abe1bd7c43492 WatchSource:0}: Error finding container 48982df86178259a939f12b95c3504ccce3ebb1863d8d1f9213abe1bd7c43492: Status 404 returned error can't find the container with id 48982df86178259a939f12b95c3504ccce3ebb1863d8d1f9213abe1bd7c43492 Dec 06 09:12:25 crc kubenswrapper[4672]: I1206 09:12:25.661781 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7"] Dec 06 09:12:25 crc kubenswrapper[4672]: W1206 09:12:25.666082 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5832ce31_cac1_4f07_8d76_f349ce678c9e.slice/crio-1fd1692770f45441d4b54a2e938aab3dfcd67861ca7e7d7607ac5a94215132ec WatchSource:0}: Error finding container 1fd1692770f45441d4b54a2e938aab3dfcd67861ca7e7d7607ac5a94215132ec: Status 404 returned error can't find the container with id 1fd1692770f45441d4b54a2e938aab3dfcd67861ca7e7d7607ac5a94215132ec Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.007256 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" event={"ID":"e87cf20e-9c40-4119-9868-02fd0fba981a","Type":"ContainerStarted","Data":"a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f"} Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.007317 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" event={"ID":"e87cf20e-9c40-4119-9868-02fd0fba981a","Type":"ContainerStarted","Data":"48982df86178259a939f12b95c3504ccce3ebb1863d8d1f9213abe1bd7c43492"} Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.009024 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.011533 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" event={"ID":"5832ce31-cac1-4f07-8d76-f349ce678c9e","Type":"ContainerStarted","Data":"5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1"} Dec 06 09:12:26 crc 
kubenswrapper[4672]: I1206 09:12:26.011574 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.011589 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" event={"ID":"5832ce31-cac1-4f07-8d76-f349ce678c9e","Type":"ContainerStarted","Data":"1fd1692770f45441d4b54a2e938aab3dfcd67861ca7e7d7607ac5a94215132ec"} Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.036021 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.088233 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" podStartSLOduration=3.088212557 podStartE2EDuration="3.088212557s" podCreationTimestamp="2025-12-06 09:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:26.062866295 +0000 UTC m=+363.807126582" watchObservedRunningTime="2025-12-06 09:12:26.088212557 +0000 UTC m=+363.832472844" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.090325 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" podStartSLOduration=3.090316085 podStartE2EDuration="3.090316085s" podCreationTimestamp="2025-12-06 09:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:26.086304934 +0000 UTC m=+363.830565221" watchObservedRunningTime="2025-12-06 09:12:26.090316085 +0000 UTC m=+363.834576372" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.380430 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.548997 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74586866d5-p29bp"] Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.565045 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e5af51-76dc-4825-bab8-a5048aea49e9" path="/var/lib/kubelet/pods/05e5af51-76dc-4825-bab8-a5048aea49e9/volumes" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.565928 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211614db-3bf5-4db7-9146-cc91303fc217" path="/var/lib/kubelet/pods/211614db-3bf5-4db7-9146-cc91303fc217/volumes" Dec 06 09:12:26 crc kubenswrapper[4672]: I1206 09:12:26.566478 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7"] Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.022126 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" podUID="e87cf20e-9c40-4119-9868-02fd0fba981a" containerName="controller-manager" containerID="cri-o://a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f" gracePeriod=30 Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.022202 4672 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" podUID="5832ce31-cac1-4f07-8d76-f349ce678c9e" containerName="route-controller-manager" containerID="cri-o://5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1" gracePeriod=30 Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.434193 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.458566 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm"] Dec 06 09:12:28 crc kubenswrapper[4672]: E1206 09:12:28.458801 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5832ce31-cac1-4f07-8d76-f349ce678c9e" containerName="route-controller-manager" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.458814 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5832ce31-cac1-4f07-8d76-f349ce678c9e" containerName="route-controller-manager" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.458953 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5832ce31-cac1-4f07-8d76-f349ce678c9e" containerName="route-controller-manager" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.459411 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.502045 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.508420 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm"] Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.512665 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5832ce31-cac1-4f07-8d76-f349ce678c9e-serving-cert\") pod \"5832ce31-cac1-4f07-8d76-f349ce678c9e\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.512705 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-client-ca\") pod \"5832ce31-cac1-4f07-8d76-f349ce678c9e\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.512801 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-config\") pod \"5832ce31-cac1-4f07-8d76-f349ce678c9e\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.512863 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb2cz\" (UniqueName: \"kubernetes.io/projected/5832ce31-cac1-4f07-8d76-f349ce678c9e-kube-api-access-jb2cz\") pod \"5832ce31-cac1-4f07-8d76-f349ce678c9e\" (UID: \"5832ce31-cac1-4f07-8d76-f349ce678c9e\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.513058 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9wqn\" (UniqueName: \"kubernetes.io/projected/52b2e50f-31ca-46f8-8bb3-7a2202b86601-kube-api-access-n9wqn\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.513088 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-client-ca\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.513115 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52b2e50f-31ca-46f8-8bb3-7a2202b86601-serving-cert\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.513388 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-config\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.513472 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-client-ca" (OuterVolumeSpecName: "client-ca") pod "5832ce31-cac1-4f07-8d76-f349ce678c9e" (UID: "5832ce31-cac1-4f07-8d76-f349ce678c9e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.513564 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-config" (OuterVolumeSpecName: "config") pod "5832ce31-cac1-4f07-8d76-f349ce678c9e" (UID: "5832ce31-cac1-4f07-8d76-f349ce678c9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.518779 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5832ce31-cac1-4f07-8d76-f349ce678c9e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5832ce31-cac1-4f07-8d76-f349ce678c9e" (UID: "5832ce31-cac1-4f07-8d76-f349ce678c9e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.524089 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5832ce31-cac1-4f07-8d76-f349ce678c9e-kube-api-access-jb2cz" (OuterVolumeSpecName: "kube-api-access-jb2cz") pod "5832ce31-cac1-4f07-8d76-f349ce678c9e" (UID: "5832ce31-cac1-4f07-8d76-f349ce678c9e"). InnerVolumeSpecName "kube-api-access-jb2cz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.614751 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btz7j\" (UniqueName: \"kubernetes.io/projected/e87cf20e-9c40-4119-9868-02fd0fba981a-kube-api-access-btz7j\") pod \"e87cf20e-9c40-4119-9868-02fd0fba981a\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.614879 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-proxy-ca-bundles\") pod \"e87cf20e-9c40-4119-9868-02fd0fba981a\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.614922 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87cf20e-9c40-4119-9868-02fd0fba981a-serving-cert\") pod \"e87cf20e-9c40-4119-9868-02fd0fba981a\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.614951 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-client-ca\") pod \"e87cf20e-9c40-4119-9868-02fd0fba981a\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615015 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-config\") pod \"e87cf20e-9c40-4119-9868-02fd0fba981a\" (UID: \"e87cf20e-9c40-4119-9868-02fd0fba981a\") " Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615168 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52b2e50f-31ca-46f8-8bb3-7a2202b86601-serving-cert\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615247 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-config\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615280 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9wqn\" (UniqueName: \"kubernetes.io/projected/52b2e50f-31ca-46f8-8bb3-7a2202b86601-kube-api-access-n9wqn\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615300 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-client-ca\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" 
Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615335 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5832ce31-cac1-4f07-8d76-f349ce678c9e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615346 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615367 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5832ce31-cac1-4f07-8d76-f349ce678c9e-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615378 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb2cz\" (UniqueName: \"kubernetes.io/projected/5832ce31-cac1-4f07-8d76-f349ce678c9e-kube-api-access-jb2cz\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.615723 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e87cf20e-9c40-4119-9868-02fd0fba981a" (UID: "e87cf20e-9c40-4119-9868-02fd0fba981a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.616155 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-config" (OuterVolumeSpecName: "config") pod "e87cf20e-9c40-4119-9868-02fd0fba981a" (UID: "e87cf20e-9c40-4119-9868-02fd0fba981a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.616221 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-client-ca\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.616710 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-client-ca" (OuterVolumeSpecName: "client-ca") pod "e87cf20e-9c40-4119-9868-02fd0fba981a" (UID: "e87cf20e-9c40-4119-9868-02fd0fba981a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.616978 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-config\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.619228 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52b2e50f-31ca-46f8-8bb3-7a2202b86601-serving-cert\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.619328 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87cf20e-9c40-4119-9868-02fd0fba981a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e87cf20e-9c40-4119-9868-02fd0fba981a" (UID: "e87cf20e-9c40-4119-9868-02fd0fba981a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.621786 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87cf20e-9c40-4119-9868-02fd0fba981a-kube-api-access-btz7j" (OuterVolumeSpecName: "kube-api-access-btz7j") pod "e87cf20e-9c40-4119-9868-02fd0fba981a" (UID: "e87cf20e-9c40-4119-9868-02fd0fba981a"). InnerVolumeSpecName "kube-api-access-btz7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.630854 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9wqn\" (UniqueName: \"kubernetes.io/projected/52b2e50f-31ca-46f8-8bb3-7a2202b86601-kube-api-access-n9wqn\") pod \"route-controller-manager-5948fdc94d-p9bdm\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.716670 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.716711 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e87cf20e-9c40-4119-9868-02fd0fba981a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.716722 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.716736 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87cf20e-9c40-4119-9868-02fd0fba981a-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.716750 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btz7j\" (UniqueName: \"kubernetes.io/projected/e87cf20e-9c40-4119-9868-02fd0fba981a-kube-api-access-btz7j\") on node \"crc\" DevicePath 
\"\"" Dec 06 09:12:28 crc kubenswrapper[4672]: I1206 09:12:28.795082 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.028625 4672 generic.go:334] "Generic (PLEG): container finished" podID="5832ce31-cac1-4f07-8d76-f349ce678c9e" containerID="5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1" exitCode=0 Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.028745 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.029360 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" event={"ID":"5832ce31-cac1-4f07-8d76-f349ce678c9e","Type":"ContainerDied","Data":"5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1"} Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.029389 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7" event={"ID":"5832ce31-cac1-4f07-8d76-f349ce678c9e","Type":"ContainerDied","Data":"1fd1692770f45441d4b54a2e938aab3dfcd67861ca7e7d7607ac5a94215132ec"} Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.029411 4672 scope.go:117] "RemoveContainer" containerID="5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.031062 4672 generic.go:334] "Generic (PLEG): container finished" podID="e87cf20e-9c40-4119-9868-02fd0fba981a" containerID="a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f" exitCode=0 Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.031094 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" event={"ID":"e87cf20e-9c40-4119-9868-02fd0fba981a","Type":"ContainerDied","Data":"a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f"} Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.031110 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" event={"ID":"e87cf20e-9c40-4119-9868-02fd0fba981a","Type":"ContainerDied","Data":"48982df86178259a939f12b95c3504ccce3ebb1863d8d1f9213abe1bd7c43492"} Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.031151 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74586866d5-p29bp" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.057745 4672 scope.go:117] "RemoveContainer" containerID="5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.058095 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7"] Dec 06 09:12:29 crc kubenswrapper[4672]: E1206 09:12:29.058354 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1\": container with ID starting with 5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1 not found: ID does not exist" containerID="5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.058422 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1"} err="failed to get container status \"5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1\": rpc error: code = NotFound desc = could not find container \"5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1\": container with ID starting with 5b4d3ae6e39f26193cb710e014ca1c74e37dea0bb6c324eed1101b24a3060af1 not found: ID does not exist" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.058451 4672 scope.go:117] "RemoveContainer" containerID="a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.062314 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678cc6964f-wjnb7"] Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.069370 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74586866d5-p29bp"] Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.075303 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74586866d5-p29bp"] Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.080810 4672 scope.go:117] "RemoveContainer" containerID="a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f" Dec 06 09:12:29 crc kubenswrapper[4672]: E1206 09:12:29.081344 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f\": container with ID starting with a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f not found: ID does not exist" containerID="a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f" Dec 06 09:12:29 crc kubenswrapper[4672]: I1206 09:12:29.081401 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f"} err="failed to get container status \"a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f\": rpc error: code = NotFound desc = could not find container \"a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f\": container with ID starting with a916184606cd558f7e261c0320196bdbde7ab4bb24bcf0e826f168035d993a1f not found: ID does not exist" Dec 06 09:12:29 crc 
kubenswrapper[4672]: I1206 09:12:29.209849 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm"] Dec 06 09:12:29 crc kubenswrapper[4672]: W1206 09:12:29.216746 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b2e50f_31ca_46f8_8bb3_7a2202b86601.slice/crio-a3659098a4901cfbc4207758d3ecc62d2e9a0b7f5b33f927c73d5e0e81aa3c33 WatchSource:0}: Error finding container a3659098a4901cfbc4207758d3ecc62d2e9a0b7f5b33f927c73d5e0e81aa3c33: Status 404 returned error can't find the container with id a3659098a4901cfbc4207758d3ecc62d2e9a0b7f5b33f927c73d5e0e81aa3c33 Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.042803 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" event={"ID":"52b2e50f-31ca-46f8-8bb3-7a2202b86601","Type":"ContainerStarted","Data":"a0615fb6c1dd69de9e462d1065466746191aacb316cd6cda36085a6aadb2cc76"} Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.043291 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" event={"ID":"52b2e50f-31ca-46f8-8bb3-7a2202b86601","Type":"ContainerStarted","Data":"a3659098a4901cfbc4207758d3ecc62d2e9a0b7f5b33f927c73d5e0e81aa3c33"} Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.044822 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.055312 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.110218 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" podStartSLOduration=4.110104302 podStartE2EDuration="4.110104302s" podCreationTimestamp="2025-12-06 09:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:30.072675396 +0000 UTC m=+367.816935683" watchObservedRunningTime="2025-12-06 09:12:30.110104302 +0000 UTC m=+367.854364589" Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.562935 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5832ce31-cac1-4f07-8d76-f349ce678c9e" path="/var/lib/kubelet/pods/5832ce31-cac1-4f07-8d76-f349ce678c9e/volumes" Dec 06 09:12:30 crc kubenswrapper[4672]: I1206 09:12:30.563507 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87cf20e-9c40-4119-9868-02fd0fba981a" path="/var/lib/kubelet/pods/e87cf20e-9c40-4119-9868-02fd0fba981a/volumes" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.095160 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5877696bb8-h28zl"] Dec 06 09:12:31 crc kubenswrapper[4672]: E1206 09:12:31.095507 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87cf20e-9c40-4119-9868-02fd0fba981a" containerName="controller-manager" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.095523 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87cf20e-9c40-4119-9868-02fd0fba981a" 
containerName="controller-manager" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.095657 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87cf20e-9c40-4119-9868-02fd0fba981a" containerName="controller-manager" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.096159 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.098728 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.101107 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.101726 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.101831 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.102406 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.104590 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.114985 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.117734 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5877696bb8-h28zl"] Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.149556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c99bb7c-d5ed-4602-89aa-48206041f746-serving-cert\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.149627 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-client-ca\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.149818 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-config\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.150079 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-proxy-ca-bundles\") pod \"controller-manager-5877696bb8-h28zl\" 
(UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.150274 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zm77\" (UniqueName: \"kubernetes.io/projected/7c99bb7c-d5ed-4602-89aa-48206041f746-kube-api-access-9zm77\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.251991 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zm77\" (UniqueName: \"kubernetes.io/projected/7c99bb7c-d5ed-4602-89aa-48206041f746-kube-api-access-9zm77\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.252054 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c99bb7c-d5ed-4602-89aa-48206041f746-serving-cert\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.252078 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-client-ca\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.252108 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-config\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.252161 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-proxy-ca-bundles\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.254129 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-proxy-ca-bundles\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.254707 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-config\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.254939 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-client-ca\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.261460 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c99bb7c-d5ed-4602-89aa-48206041f746-serving-cert\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.278087 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zm77\" (UniqueName: \"kubernetes.io/projected/7c99bb7c-d5ed-4602-89aa-48206041f746-kube-api-access-9zm77\") pod \"controller-manager-5877696bb8-h28zl\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.416594 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:31 crc kubenswrapper[4672]: I1206 09:12:31.654107 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5877696bb8-h28zl"] Dec 06 09:12:32 crc kubenswrapper[4672]: I1206 09:12:32.065250 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" event={"ID":"7c99bb7c-d5ed-4602-89aa-48206041f746","Type":"ContainerStarted","Data":"b7638b99a112e11b5d1bbc1a1e46926573ad4fdc8f8d1dbcce938bd72d746c21"} Dec 06 09:12:32 crc kubenswrapper[4672]: I1206 09:12:32.066043 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" event={"ID":"7c99bb7c-d5ed-4602-89aa-48206041f746","Type":"ContainerStarted","Data":"d4b15e3d193bf80cb542f3256f9e83879db43d2a7751fa88dc62b84a1f4f3d9f"} Dec 06 09:12:32 crc kubenswrapper[4672]: I1206 09:12:32.067128 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:32 crc kubenswrapper[4672]: I1206 09:12:32.121073 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:32 crc kubenswrapper[4672]: I1206 09:12:32.152263 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" podStartSLOduration=6.152237432 podStartE2EDuration="6.152237432s" podCreationTimestamp="2025-12-06 09:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:32.091210274 +0000 UTC m=+369.835470571" watchObservedRunningTime="2025-12-06 09:12:32.152237432 +0000 UTC m=+369.896497719" Dec 06 09:12:42 crc kubenswrapper[4672]: I1206 09:12:42.320246 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:12:42 crc kubenswrapper[4672]: I1206 09:12:42.321386 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:12:42 crc kubenswrapper[4672]: I1206 09:12:42.427980 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-r2nq8" Dec 06 09:12:42 crc kubenswrapper[4672]: I1206 09:12:42.493939 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbpp6"] Dec 06 09:12:43 crc kubenswrapper[4672]: I1206 09:12:43.839151 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5877696bb8-h28zl"] Dec 06 09:12:43 crc kubenswrapper[4672]: I1206 09:12:43.839785 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" podUID="7c99bb7c-d5ed-4602-89aa-48206041f746" containerName="controller-manager" containerID="cri-o://b7638b99a112e11b5d1bbc1a1e46926573ad4fdc8f8d1dbcce938bd72d746c21" gracePeriod=30 Dec 06 09:12:43 crc kubenswrapper[4672]: I1206 09:12:43.857728 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm"] Dec 06 09:12:43 crc kubenswrapper[4672]: I1206 09:12:43.857990 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" podUID="52b2e50f-31ca-46f8-8bb3-7a2202b86601" containerName="route-controller-manager" containerID="cri-o://a0615fb6c1dd69de9e462d1065466746191aacb316cd6cda36085a6aadb2cc76" gracePeriod=30 Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.125291 4672 generic.go:334] "Generic (PLEG): container finished" podID="52b2e50f-31ca-46f8-8bb3-7a2202b86601" containerID="a0615fb6c1dd69de9e462d1065466746191aacb316cd6cda36085a6aadb2cc76" exitCode=0 Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.125372 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" event={"ID":"52b2e50f-31ca-46f8-8bb3-7a2202b86601","Type":"ContainerDied","Data":"a0615fb6c1dd69de9e462d1065466746191aacb316cd6cda36085a6aadb2cc76"} Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.126923 4672 generic.go:334] "Generic (PLEG): container finished" podID="7c99bb7c-d5ed-4602-89aa-48206041f746" containerID="b7638b99a112e11b5d1bbc1a1e46926573ad4fdc8f8d1dbcce938bd72d746c21" exitCode=0 Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.126952 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" event={"ID":"7c99bb7c-d5ed-4602-89aa-48206041f746","Type":"ContainerDied","Data":"b7638b99a112e11b5d1bbc1a1e46926573ad4fdc8f8d1dbcce938bd72d746c21"} Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.371753 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.414756 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543758 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-config\") pod \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543810 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-client-ca\") pod \"7c99bb7c-d5ed-4602-89aa-48206041f746\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543840 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52b2e50f-31ca-46f8-8bb3-7a2202b86601-serving-cert\") pod \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543906 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-client-ca\") pod \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543944 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9wqn\" (UniqueName: \"kubernetes.io/projected/52b2e50f-31ca-46f8-8bb3-7a2202b86601-kube-api-access-n9wqn\") pod \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\" (UID: \"52b2e50f-31ca-46f8-8bb3-7a2202b86601\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543967 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c99bb7c-d5ed-4602-89aa-48206041f746-serving-cert\") pod \"7c99bb7c-d5ed-4602-89aa-48206041f746\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.543985 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-proxy-ca-bundles\") pod \"7c99bb7c-d5ed-4602-89aa-48206041f746\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.544008 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-config\") pod \"7c99bb7c-d5ed-4602-89aa-48206041f746\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.544049 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zm77\" (UniqueName: \"kubernetes.io/projected/7c99bb7c-d5ed-4602-89aa-48206041f746-kube-api-access-9zm77\") pod \"7c99bb7c-d5ed-4602-89aa-48206041f746\" (UID: \"7c99bb7c-d5ed-4602-89aa-48206041f746\") " Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.544876 4672 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-client-ca" (OuterVolumeSpecName: "client-ca") pod "7c99bb7c-d5ed-4602-89aa-48206041f746" (UID: "7c99bb7c-d5ed-4602-89aa-48206041f746"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.544881 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-client-ca" (OuterVolumeSpecName: "client-ca") pod "52b2e50f-31ca-46f8-8bb3-7a2202b86601" (UID: "52b2e50f-31ca-46f8-8bb3-7a2202b86601"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.544944 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-config" (OuterVolumeSpecName: "config") pod "52b2e50f-31ca-46f8-8bb3-7a2202b86601" (UID: "52b2e50f-31ca-46f8-8bb3-7a2202b86601"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.545525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7c99bb7c-d5ed-4602-89aa-48206041f746" (UID: "7c99bb7c-d5ed-4602-89aa-48206041f746"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.545683 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-config" (OuterVolumeSpecName: "config") pod "7c99bb7c-d5ed-4602-89aa-48206041f746" (UID: "7c99bb7c-d5ed-4602-89aa-48206041f746"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.550015 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c99bb7c-d5ed-4602-89aa-48206041f746-kube-api-access-9zm77" (OuterVolumeSpecName: "kube-api-access-9zm77") pod "7c99bb7c-d5ed-4602-89aa-48206041f746" (UID: "7c99bb7c-d5ed-4602-89aa-48206041f746"). InnerVolumeSpecName "kube-api-access-9zm77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.550065 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b2e50f-31ca-46f8-8bb3-7a2202b86601-kube-api-access-n9wqn" (OuterVolumeSpecName: "kube-api-access-n9wqn") pod "52b2e50f-31ca-46f8-8bb3-7a2202b86601" (UID: "52b2e50f-31ca-46f8-8bb3-7a2202b86601"). InnerVolumeSpecName "kube-api-access-n9wqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.550352 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c99bb7c-d5ed-4602-89aa-48206041f746-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7c99bb7c-d5ed-4602-89aa-48206041f746" (UID: "7c99bb7c-d5ed-4602-89aa-48206041f746"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.551285 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b2e50f-31ca-46f8-8bb3-7a2202b86601-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52b2e50f-31ca-46f8-8bb3-7a2202b86601" (UID: "52b2e50f-31ca-46f8-8bb3-7a2202b86601"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.645901 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9wqn\" (UniqueName: \"kubernetes.io/projected/52b2e50f-31ca-46f8-8bb3-7a2202b86601-kube-api-access-n9wqn\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.645954 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c99bb7c-d5ed-4602-89aa-48206041f746-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.645964 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.645972 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.645981 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zm77\" (UniqueName: \"kubernetes.io/projected/7c99bb7c-d5ed-4602-89aa-48206041f746-kube-api-access-9zm77\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.646008 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.646017 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c99bb7c-d5ed-4602-89aa-48206041f746-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.646028 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52b2e50f-31ca-46f8-8bb3-7a2202b86601-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:44 crc kubenswrapper[4672]: I1206 09:12:44.646036 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52b2e50f-31ca-46f8-8bb3-7a2202b86601-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.101744 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l"] Dec 06 09:12:45 crc kubenswrapper[4672]: E1206 09:12:45.102220 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b2e50f-31ca-46f8-8bb3-7a2202b86601" containerName="route-controller-manager" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.102250 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b2e50f-31ca-46f8-8bb3-7a2202b86601" containerName="route-controller-manager" Dec 06 09:12:45 crc kubenswrapper[4672]: E1206 09:12:45.102288 4672 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7c99bb7c-d5ed-4602-89aa-48206041f746" containerName="controller-manager" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.102305 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c99bb7c-d5ed-4602-89aa-48206041f746" containerName="controller-manager" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.102545 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b2e50f-31ca-46f8-8bb3-7a2202b86601" containerName="route-controller-manager" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.102585 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c99bb7c-d5ed-4602-89aa-48206041f746" containerName="controller-manager" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.103242 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.107505 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.108327 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.127719 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.135485 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.135757 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm" event={"ID":"52b2e50f-31ca-46f8-8bb3-7a2202b86601","Type":"ContainerDied","Data":"a3659098a4901cfbc4207758d3ecc62d2e9a0b7f5b33f927c73d5e0e81aa3c33"} Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.135827 4672 scope.go:117] "RemoveContainer" containerID="a0615fb6c1dd69de9e462d1065466746191aacb316cd6cda36085a6aadb2cc76" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.136896 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" event={"ID":"7c99bb7c-d5ed-4602-89aa-48206041f746","Type":"ContainerDied","Data":"d4b15e3d193bf80cb542f3256f9e83879db43d2a7751fa88dc62b84a1f4f3d9f"} Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.136971 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5877696bb8-h28zl" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.159887 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.160007 4672 scope.go:117] "RemoveContainer" containerID="b7638b99a112e11b5d1bbc1a1e46926573ad4fdc8f8d1dbcce938bd72d746c21" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.185672 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5877696bb8-h28zl"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.186295 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5877696bb8-h28zl"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.191184 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.212753 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5948fdc94d-p9bdm"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254493 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b602a587-6eda-433a-afc6-a929fb553dc7-config\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254554 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-client-ca\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254579 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6rd\" (UniqueName: \"kubernetes.io/projected/9401395f-01fb-4e10-9f96-6548014d3b6d-kube-api-access-ck6rd\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254613 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-proxy-ca-bundles\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254633 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b602a587-6eda-433a-afc6-a929fb553dc7-client-ca\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254655 4672 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9401395f-01fb-4e10-9f96-6548014d3b6d-serving-cert\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254685 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2zr7\" (UniqueName: \"kubernetes.io/projected/b602a587-6eda-433a-afc6-a929fb553dc7-kube-api-access-q2zr7\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.254964 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b602a587-6eda-433a-afc6-a929fb553dc7-serving-cert\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.255048 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-config\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.358140 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-client-ca\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.358683 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b602a587-6eda-433a-afc6-a929fb553dc7-config\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.358927 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6rd\" (UniqueName: \"kubernetes.io/projected/9401395f-01fb-4e10-9f96-6548014d3b6d-kube-api-access-ck6rd\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.359190 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-proxy-ca-bundles\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.359358 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b602a587-6eda-433a-afc6-a929fb553dc7-client-ca\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.359463 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9401395f-01fb-4e10-9f96-6548014d3b6d-serving-cert\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.359674 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2zr7\" (UniqueName: \"kubernetes.io/projected/b602a587-6eda-433a-afc6-a929fb553dc7-kube-api-access-q2zr7\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.359858 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b602a587-6eda-433a-afc6-a929fb553dc7-serving-cert\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.360006 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-config\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.359565 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-client-ca\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.360066 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b602a587-6eda-433a-afc6-a929fb553dc7-config\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.360288 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b602a587-6eda-433a-afc6-a929fb553dc7-client-ca\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.361012 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-proxy-ca-bundles\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: 
\"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.362070 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9401395f-01fb-4e10-9f96-6548014d3b6d-config\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.364866 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9401395f-01fb-4e10-9f96-6548014d3b6d-serving-cert\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.366085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b602a587-6eda-433a-afc6-a929fb553dc7-serving-cert\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.388714 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6rd\" (UniqueName: \"kubernetes.io/projected/9401395f-01fb-4e10-9f96-6548014d3b6d-kube-api-access-ck6rd\") pod \"controller-manager-5c9df99b5b-vgq8l\" (UID: \"9401395f-01fb-4e10-9f96-6548014d3b6d\") " pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.389797 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2zr7\" (UniqueName: \"kubernetes.io/projected/b602a587-6eda-433a-afc6-a929fb553dc7-kube-api-access-q2zr7\") pod \"route-controller-manager-6fcd5cbbdd-6gw4c\" (UID: \"b602a587-6eda-433a-afc6-a929fb553dc7\") " pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.419134 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.431418 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.669926 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l"] Dec 06 09:12:45 crc kubenswrapper[4672]: I1206 09:12:45.714104 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c"] Dec 06 09:12:45 crc kubenswrapper[4672]: W1206 09:12:45.721717 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb602a587_6eda_433a_afc6_a929fb553dc7.slice/crio-a4335a6649a9027d3c17fde769b29ba1dfbe71de9481e24a834afe12ee3ebe30 WatchSource:0}: Error finding container a4335a6649a9027d3c17fde769b29ba1dfbe71de9481e24a834afe12ee3ebe30: Status 404 returned error can't find the container with id a4335a6649a9027d3c17fde769b29ba1dfbe71de9481e24a834afe12ee3ebe30 Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.142869 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" event={"ID":"b602a587-6eda-433a-afc6-a929fb553dc7","Type":"ContainerStarted","Data":"6d43362c655bb17ea237dc6782d2e759d09722d46587a9e932d1480fff146f74"} Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.142939 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" event={"ID":"b602a587-6eda-433a-afc6-a929fb553dc7","Type":"ContainerStarted","Data":"a4335a6649a9027d3c17fde769b29ba1dfbe71de9481e24a834afe12ee3ebe30"} Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.144060 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.144156 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" event={"ID":"9401395f-01fb-4e10-9f96-6548014d3b6d","Type":"ContainerStarted","Data":"0f51c078c18c55796649687f2e1314ff2494005136636754c21d9aacf92f6780"} Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.144219 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" event={"ID":"9401395f-01fb-4e10-9f96-6548014d3b6d","Type":"ContainerStarted","Data":"6f2f302081b8a423dbccf35aeca9a7de254d09cde2317d292187edb267215a4d"} Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.144583 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.152085 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.161723 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" podStartSLOduration=3.161709247 podStartE2EDuration="3.161709247s" podCreationTimestamp="2025-12-06 09:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:46.160174515 +0000 
UTC m=+383.904434802" watchObservedRunningTime="2025-12-06 09:12:46.161709247 +0000 UTC m=+383.905969534" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.183070 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c9df99b5b-vgq8l" podStartSLOduration=3.183055448 podStartE2EDuration="3.183055448s" podCreationTimestamp="2025-12-06 09:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:12:46.181134924 +0000 UTC m=+383.925395221" watchObservedRunningTime="2025-12-06 09:12:46.183055448 +0000 UTC m=+383.927315735" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.243150 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fcd5cbbdd-6gw4c" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.563439 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b2e50f-31ca-46f8-8bb3-7a2202b86601" path="/var/lib/kubelet/pods/52b2e50f-31ca-46f8-8bb3-7a2202b86601/volumes" Dec 06 09:12:46 crc kubenswrapper[4672]: I1206 09:12:46.564020 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c99bb7c-d5ed-4602-89aa-48206041f746" path="/var/lib/kubelet/pods/7c99bb7c-d5ed-4602-89aa-48206041f746/volumes" Dec 06 09:13:07 crc kubenswrapper[4672]: I1206 09:13:07.544715 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" podUID="f874c07b-7566-441d-9546-6c3f7b64de13" containerName="registry" containerID="cri-o://083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037" gracePeriod=30 Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.060707 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190025 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mck6n\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-kube-api-access-mck6n\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190326 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190427 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-bound-sa-token\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190498 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f874c07b-7566-441d-9546-6c3f7b64de13-ca-trust-extracted\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190697 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-trusted-ca\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190811 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-registry-tls\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.190905 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-registry-certificates\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.191010 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f874c07b-7566-441d-9546-6c3f7b64de13-installation-pull-secrets\") pod \"f874c07b-7566-441d-9546-6c3f7b64de13\" (UID: \"f874c07b-7566-441d-9546-6c3f7b64de13\") " Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.192004 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.192285 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.196592 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.199284 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.199791 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-kube-api-access-mck6n" (OuterVolumeSpecName: "kube-api-access-mck6n") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "kube-api-access-mck6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.199931 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f874c07b-7566-441d-9546-6c3f7b64de13-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.205781 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.216322 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f874c07b-7566-441d-9546-6c3f7b64de13-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f874c07b-7566-441d-9546-6c3f7b64de13" (UID: "f874c07b-7566-441d-9546-6c3f7b64de13"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.277026 4672 generic.go:334] "Generic (PLEG): container finished" podID="f874c07b-7566-441d-9546-6c3f7b64de13" containerID="083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037" exitCode=0 Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.277077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" event={"ID":"f874c07b-7566-441d-9546-6c3f7b64de13","Type":"ContainerDied","Data":"083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037"} Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.277108 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" event={"ID":"f874c07b-7566-441d-9546-6c3f7b64de13","Type":"ContainerDied","Data":"884e6a4f2788b70182a892c58f356d22ceb1bfc18abcd28f5f6bf5d06079dd69"} Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.277130 4672 scope.go:117] "RemoveContainer" containerID="083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.277140 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbpp6" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292131 4672 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292167 4672 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f874c07b-7566-441d-9546-6c3f7b64de13-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292180 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mck6n\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-kube-api-access-mck6n\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292194 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292203 4672 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f874c07b-7566-441d-9546-6c3f7b64de13-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292213 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f874c07b-7566-441d-9546-6c3f7b64de13-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.292224 4672 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f874c07b-7566-441d-9546-6c3f7b64de13-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.302533 4672 scope.go:117] "RemoveContainer" containerID="083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037" Dec 06 09:13:08 crc kubenswrapper[4672]: E1206 09:13:08.306227 4672 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037\": container with ID starting with 083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037 not found: ID does not exist" containerID="083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.306463 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037"} err="failed to get container status \"083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037\": rpc error: code = NotFound desc = could not find container \"083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037\": container with ID starting with 083aa75bae22861c8f4f062ed48c6719e3c88e04195da26346b88708b03ce037 not found: ID does not exist" Dec 06 09:13:08 crc kubenswrapper[4672]: E1206 09:13:08.320887 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf874c07b_7566_441d_9546_6c3f7b64de13.slice\": RecentStats: unable to find data in memory cache]" Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.327110 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbpp6"] Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.329296 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbpp6"] Dec 06 09:13:08 crc kubenswrapper[4672]: I1206 09:13:08.567382 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f874c07b-7566-441d-9546-6c3f7b64de13" path="/var/lib/kubelet/pods/f874c07b-7566-441d-9546-6c3f7b64de13/volumes" Dec 06 09:13:12 crc kubenswrapper[4672]: I1206 09:13:12.319842 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:13:12 crc kubenswrapper[4672]: I1206 09:13:12.320219 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.319763 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.321809 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.321921 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.322986 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c07965cc625156f67df18ec68f14cf89ea9bd464984c84ab0aa0cd0dd54f62ac"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.323098 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://c07965cc625156f67df18ec68f14cf89ea9bd464984c84ab0aa0cd0dd54f62ac" gracePeriod=600 Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.478644 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="c07965cc625156f67df18ec68f14cf89ea9bd464984c84ab0aa0cd0dd54f62ac" exitCode=0 Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.478746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"c07965cc625156f67df18ec68f14cf89ea9bd464984c84ab0aa0cd0dd54f62ac"} Dec 06 09:13:42 crc kubenswrapper[4672]: I1206 09:13:42.479065 4672 scope.go:117] "RemoveContainer" containerID="389eb5011ceb2fc5c77e359d7c5066d0d013ca72ce83527f9882e3ed743b5a3b" Dec 06 09:13:43 crc kubenswrapper[4672]: I1206 09:13:43.487116 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"2b410864b2f905e632c9f0faa7e115cee3e4f8d1dd843cd26f566a60bf5790f9"} Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.209842 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl"] Dec 06 09:15:00 crc kubenswrapper[4672]: E1206 09:15:00.210774 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f874c07b-7566-441d-9546-6c3f7b64de13" containerName="registry" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.210789 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f874c07b-7566-441d-9546-6c3f7b64de13" containerName="registry" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.210904 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f874c07b-7566-441d-9546-6c3f7b64de13" containerName="registry" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.211303 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl"] Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.211382 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.224884 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.230904 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.299260 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-552qz\" (UniqueName: \"kubernetes.io/projected/5571247b-599d-4a18-b507-f9a153ded2ec-kube-api-access-552qz\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.299378 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5571247b-599d-4a18-b507-f9a153ded2ec-secret-volume\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.299438 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5571247b-599d-4a18-b507-f9a153ded2ec-config-volume\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.400383 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5571247b-599d-4a18-b507-f9a153ded2ec-secret-volume\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.400455 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5571247b-599d-4a18-b507-f9a153ded2ec-config-volume\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.400495 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-552qz\" (UniqueName: \"kubernetes.io/projected/5571247b-599d-4a18-b507-f9a153ded2ec-kube-api-access-552qz\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.401620 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5571247b-599d-4a18-b507-f9a153ded2ec-config-volume\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc 
kubenswrapper[4672]: I1206 09:15:00.407845 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5571247b-599d-4a18-b507-f9a153ded2ec-secret-volume\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.418783 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-552qz\" (UniqueName: \"kubernetes.io/projected/5571247b-599d-4a18-b507-f9a153ded2ec-kube-api-access-552qz\") pod \"collect-profiles-29416875-rxxdl\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.541035 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:00 crc kubenswrapper[4672]: I1206 09:15:00.729499 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl"] Dec 06 09:15:01 crc kubenswrapper[4672]: I1206 09:15:01.648107 4672 generic.go:334] "Generic (PLEG): container finished" podID="5571247b-599d-4a18-b507-f9a153ded2ec" containerID="6e94145b9634ce6896a4ba9e85586c7ab48924ba369b1a34839cb38fb75a2012" exitCode=0 Dec 06 09:15:01 crc kubenswrapper[4672]: I1206 09:15:01.648393 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" event={"ID":"5571247b-599d-4a18-b507-f9a153ded2ec","Type":"ContainerDied","Data":"6e94145b9634ce6896a4ba9e85586c7ab48924ba369b1a34839cb38fb75a2012"} Dec 06 09:15:01 crc kubenswrapper[4672]: I1206 09:15:01.648418 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" event={"ID":"5571247b-599d-4a18-b507-f9a153ded2ec","Type":"ContainerStarted","Data":"c6d403069cf84db8f0d760b97fb3504972f6b8bc8723552ef5b064566d6fd290"} Dec 06 09:15:02 crc kubenswrapper[4672]: I1206 09:15:02.909406 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.037965 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5571247b-599d-4a18-b507-f9a153ded2ec-secret-volume\") pod \"5571247b-599d-4a18-b507-f9a153ded2ec\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.038093 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5571247b-599d-4a18-b507-f9a153ded2ec-config-volume\") pod \"5571247b-599d-4a18-b507-f9a153ded2ec\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.038155 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-552qz\" (UniqueName: \"kubernetes.io/projected/5571247b-599d-4a18-b507-f9a153ded2ec-kube-api-access-552qz\") pod \"5571247b-599d-4a18-b507-f9a153ded2ec\" (UID: \"5571247b-599d-4a18-b507-f9a153ded2ec\") " Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.038918 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5571247b-599d-4a18-b507-f9a153ded2ec-config-volume" (OuterVolumeSpecName: "config-volume") pod "5571247b-599d-4a18-b507-f9a153ded2ec" (UID: "5571247b-599d-4a18-b507-f9a153ded2ec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.044445 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5571247b-599d-4a18-b507-f9a153ded2ec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5571247b-599d-4a18-b507-f9a153ded2ec" (UID: "5571247b-599d-4a18-b507-f9a153ded2ec"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.044807 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5571247b-599d-4a18-b507-f9a153ded2ec-kube-api-access-552qz" (OuterVolumeSpecName: "kube-api-access-552qz") pod "5571247b-599d-4a18-b507-f9a153ded2ec" (UID: "5571247b-599d-4a18-b507-f9a153ded2ec"). InnerVolumeSpecName "kube-api-access-552qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.139446 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5571247b-599d-4a18-b507-f9a153ded2ec-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.139745 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-552qz\" (UniqueName: \"kubernetes.io/projected/5571247b-599d-4a18-b507-f9a153ded2ec-kube-api-access-552qz\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.139813 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5571247b-599d-4a18-b507-f9a153ded2ec-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.664838 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" event={"ID":"5571247b-599d-4a18-b507-f9a153ded2ec","Type":"ContainerDied","Data":"c6d403069cf84db8f0d760b97fb3504972f6b8bc8723552ef5b064566d6fd290"} Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.664900 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6d403069cf84db8f0d760b97fb3504972f6b8bc8723552ef5b064566d6fd290" Dec 06 09:15:03 crc kubenswrapper[4672]: I1206 09:15:03.664977 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl" Dec 06 09:15:42 crc kubenswrapper[4672]: I1206 09:15:42.320480 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:15:42 crc kubenswrapper[4672]: I1206 09:15:42.321308 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.881674 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qscd7"] Dec 06 09:15:49 crc kubenswrapper[4672]: E1206 09:15:49.883029 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5571247b-599d-4a18-b507-f9a153ded2ec" containerName="collect-profiles" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.883051 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5571247b-599d-4a18-b507-f9a153ded2ec" containerName="collect-profiles" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.883200 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5571247b-599d-4a18-b507-f9a153ded2ec" containerName="collect-profiles" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.883879 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.887333 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kkdp5"] Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.888149 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kkdp5" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.910578 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.923780 4672 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wl4cz" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.923783 4672 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r6wts" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.923804 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.928644 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qscd7"] Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.932527 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kkdp5"] Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.953967 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-kdh29"] Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.954933 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.957665 4672 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tlvs6" Dec 06 09:15:49 crc kubenswrapper[4672]: I1206 09:15:49.967324 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-kdh29"] Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.041555 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz6wq\" (UniqueName: \"kubernetes.io/projected/9a0083d7-9175-4399-aaf0-0767c9d88faf-kube-api-access-xz6wq\") pod \"cert-manager-cainjector-7f985d654d-qscd7\" (UID: \"9a0083d7-9175-4399-aaf0-0767c9d88faf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.041638 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtht\" (UniqueName: \"kubernetes.io/projected/ca049150-2cd7-48c8-a77a-90379dbd818b-kube-api-access-pbtht\") pod \"cert-manager-5b446d88c5-kkdp5\" (UID: \"ca049150-2cd7-48c8-a77a-90379dbd818b\") " pod="cert-manager/cert-manager-5b446d88c5-kkdp5" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.142546 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlms\" (UniqueName: \"kubernetes.io/projected/23285e10-efd9-47e7-929b-e3fa93131669-kube-api-access-tjlms\") pod \"cert-manager-webhook-5655c58dd6-kdh29\" (UID: \"23285e10-efd9-47e7-929b-e3fa93131669\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:15:50 
crc kubenswrapper[4672]: I1206 09:15:50.142658 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz6wq\" (UniqueName: \"kubernetes.io/projected/9a0083d7-9175-4399-aaf0-0767c9d88faf-kube-api-access-xz6wq\") pod \"cert-manager-cainjector-7f985d654d-qscd7\" (UID: \"9a0083d7-9175-4399-aaf0-0767c9d88faf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.142685 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtht\" (UniqueName: \"kubernetes.io/projected/ca049150-2cd7-48c8-a77a-90379dbd818b-kube-api-access-pbtht\") pod \"cert-manager-5b446d88c5-kkdp5\" (UID: \"ca049150-2cd7-48c8-a77a-90379dbd818b\") " pod="cert-manager/cert-manager-5b446d88c5-kkdp5" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.163095 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtht\" (UniqueName: \"kubernetes.io/projected/ca049150-2cd7-48c8-a77a-90379dbd818b-kube-api-access-pbtht\") pod \"cert-manager-5b446d88c5-kkdp5\" (UID: \"ca049150-2cd7-48c8-a77a-90379dbd818b\") " pod="cert-manager/cert-manager-5b446d88c5-kkdp5" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.164788 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz6wq\" (UniqueName: \"kubernetes.io/projected/9a0083d7-9175-4399-aaf0-0767c9d88faf-kube-api-access-xz6wq\") pod \"cert-manager-cainjector-7f985d654d-qscd7\" (UID: \"9a0083d7-9175-4399-aaf0-0767c9d88faf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.211835 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.222482 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kkdp5" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.244634 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlms\" (UniqueName: \"kubernetes.io/projected/23285e10-efd9-47e7-929b-e3fa93131669-kube-api-access-tjlms\") pod \"cert-manager-webhook-5655c58dd6-kdh29\" (UID: \"23285e10-efd9-47e7-929b-e3fa93131669\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.266449 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlms\" (UniqueName: \"kubernetes.io/projected/23285e10-efd9-47e7-929b-e3fa93131669-kube-api-access-tjlms\") pod \"cert-manager-webhook-5655c58dd6-kdh29\" (UID: \"23285e10-efd9-47e7-929b-e3fa93131669\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.268348 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.513927 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qscd7"] Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.525778 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.765951 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-kdh29"] Dec 06 09:15:50 crc kubenswrapper[4672]: I1206 09:15:50.774563 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kkdp5"] Dec 06 09:15:50 crc kubenswrapper[4672]: W1206 09:15:50.785503 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca049150_2cd7_48c8_a77a_90379dbd818b.slice/crio-12e6512fad36110324efb832d1bc3c34b8702a152de80644320b44185c71a7b1 WatchSource:0}: Error finding container 12e6512fad36110324efb832d1bc3c34b8702a152de80644320b44185c71a7b1: Status 404 returned error can't find the container with id 12e6512fad36110324efb832d1bc3c34b8702a152de80644320b44185c71a7b1 Dec 06 09:15:51 crc kubenswrapper[4672]: I1206 09:15:51.530209 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" event={"ID":"9a0083d7-9175-4399-aaf0-0767c9d88faf","Type":"ContainerStarted","Data":"0dd0703a01d05d1d53d9e06eeb7100509f3fed94b03c85a6cf7d53769ee1b7c7"} Dec 06 09:15:51 crc kubenswrapper[4672]: I1206 09:15:51.532380 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kkdp5" event={"ID":"ca049150-2cd7-48c8-a77a-90379dbd818b","Type":"ContainerStarted","Data":"12e6512fad36110324efb832d1bc3c34b8702a152de80644320b44185c71a7b1"} Dec 06 09:15:51 crc kubenswrapper[4672]: I1206 09:15:51.533513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" event={"ID":"23285e10-efd9-47e7-929b-e3fa93131669","Type":"ContainerStarted","Data":"55a937e99442d416899f100ffa78576d86f745067713961053cf0d2d4d74d363"} Dec 06 09:15:53 crc kubenswrapper[4672]: I1206 09:15:53.545153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" event={"ID":"9a0083d7-9175-4399-aaf0-0767c9d88faf","Type":"ContainerStarted","Data":"9ec14bbf21fbc2172e5f690697e25107bc11db7987679a0e9f42a7de29502055"} Dec 06 09:15:54 crc kubenswrapper[4672]: I1206 09:15:54.553465 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" event={"ID":"23285e10-efd9-47e7-929b-e3fa93131669","Type":"ContainerStarted","Data":"af695bf1b4747d7722834a6c0df0794da81b9f7fbf7aff33bc7dd9ad50c199c9"} Dec 06 09:15:54 crc kubenswrapper[4672]: I1206 09:15:54.554166 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:15:54 crc kubenswrapper[4672]: I1206 09:15:54.555349 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kkdp5" event={"ID":"ca049150-2cd7-48c8-a77a-90379dbd818b","Type":"ContainerStarted","Data":"5f49ea82414a97e21334fe8113719d340428f9801f2162d379c60097b3bced99"} Dec 06 09:15:54 crc kubenswrapper[4672]: I1206 09:15:54.576470 4672 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qscd7" podStartSLOduration=3.658358862 podStartE2EDuration="5.576450459s" podCreationTimestamp="2025-12-06 09:15:49 +0000 UTC" firstStartedPulling="2025-12-06 09:15:50.525503601 +0000 UTC m=+568.269763878" lastFinishedPulling="2025-12-06 09:15:52.443595188 +0000 UTC m=+570.187855475" observedRunningTime="2025-12-06 09:15:53.562758528 +0000 UTC m=+571.307018835" watchObservedRunningTime="2025-12-06 09:15:54.576450459 +0000 UTC m=+572.320710756" Dec 06 09:15:54 crc kubenswrapper[4672]: I1206 09:15:54.578149 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" podStartSLOduration=2.156291409 podStartE2EDuration="5.578141326s" podCreationTimestamp="2025-12-06 09:15:49 +0000 UTC" firstStartedPulling="2025-12-06 09:15:50.786024861 +0000 UTC m=+568.530285148" lastFinishedPulling="2025-12-06 09:15:54.207874778 +0000 UTC m=+571.952135065" observedRunningTime="2025-12-06 09:15:54.574392063 +0000 UTC m=+572.318652370" watchObservedRunningTime="2025-12-06 09:15:54.578141326 +0000 UTC m=+572.322401633" Dec 06 09:15:54 crc kubenswrapper[4672]: I1206 09:15:54.598115 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-kkdp5" podStartSLOduration=2.170084917 podStartE2EDuration="5.598088143s" podCreationTimestamp="2025-12-06 09:15:49 +0000 UTC" firstStartedPulling="2025-12-06 09:15:50.787895532 +0000 UTC m=+568.532155809" lastFinishedPulling="2025-12-06 09:15:54.215898748 +0000 UTC m=+571.960159035" observedRunningTime="2025-12-06 09:15:54.595871572 +0000 UTC m=+572.340131869" watchObservedRunningTime="2025-12-06 09:15:54.598088143 +0000 UTC m=+572.342348430" Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.959259 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbbs5"] Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.959906 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-controller" containerID="cri-o://0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.960001 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.960017 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="nbdb" containerID="cri-o://7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.960119 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-node" containerID="cri-o://68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.960094 4672 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="northd" containerID="cri-o://5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.960283 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="sbdb" containerID="cri-o://97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.960384 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-acl-logging" containerID="cri-o://6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c" gracePeriod=30 Dec 06 09:15:59 crc kubenswrapper[4672]: I1206 09:15:59.992886 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" containerID="cri-o://c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" gracePeriod=30 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.270812 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-kdh29" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.301296 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/3.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.303093 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovn-acl-logging/0.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.303499 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovn-controller/0.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.303971 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.353943 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wjlb9"] Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354186 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="northd" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354206 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="northd" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354214 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354220 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354231 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354239 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354248 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-acl-logging" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354255 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-acl-logging" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354267 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354275 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354287 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354294 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354305 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354311 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354318 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="nbdb" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354324 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="nbdb" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354330 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kubecfg-setup" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354336 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kubecfg-setup" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354346 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-node" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354351 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-node" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354360 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="sbdb" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354365 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="sbdb" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354450 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354459 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-node" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354469 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="nbdb" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354475 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354482 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354492 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354499 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354507 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="sbdb" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354515 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-acl-logging" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354523 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="northd" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354532 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovn-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354635 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354641 4672 
state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.354651 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354657 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.354739 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerName="ovnkube-controller" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.356271 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418289 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-kubelet\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418413 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418734 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418734 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-netns\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418805 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-slash\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418830 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-env-overrides\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418851 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blgnn\" (UniqueName: \"kubernetes.io/projected/713432b9-3b28-4ad0-b578-9d42aa1931aa-kube-api-access-blgnn\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418883 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-etc-openvswitch\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418904 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-openvswitch\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418927 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-log-socket\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418945 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-netd\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.418961 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-systemd-units\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419006 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-systemd\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: 
\"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419031 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-bin\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419053 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-ovn-kubernetes\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419091 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-config\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419119 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-node-log\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419135 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419152 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-var-lib-openvswitch\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419170 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-script-lib\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419195 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-ovn\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419215 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovn-node-metrics-cert\") pod \"713432b9-3b28-4ad0-b578-9d42aa1931aa\" (UID: \"713432b9-3b28-4ad0-b578-9d42aa1931aa\") " Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419566 4672 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-kubelet\") on node \"crc\" 
DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419587 4672 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419640 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419696 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419699 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419721 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419740 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419776 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-slash" (OuterVolumeSpecName: "host-slash") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419794 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-log-socket" (OuterVolumeSpecName: "log-socket") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419822 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419848 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419873 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.419898 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-node-log" (OuterVolumeSpecName: "node-log") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.420020 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.420066 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.420103 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.420204 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.425533 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713432b9-3b28-4ad0-b578-9d42aa1931aa-kube-api-access-blgnn" (OuterVolumeSpecName: "kube-api-access-blgnn") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "kube-api-access-blgnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.425920 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.434292 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "713432b9-3b28-4ad0-b578-9d42aa1931aa" (UID: "713432b9-3b28-4ad0-b578-9d42aa1931aa"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.521816 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-slash\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.521886 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-cni-netd\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.521918 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovn-node-metrics-cert\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522008 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-kubelet\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522072 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522096 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-systemd\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522133 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcc9j\" (UniqueName: \"kubernetes.io/projected/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-kube-api-access-fcc9j\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522154 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovnkube-config\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522190 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-systemd-units\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522210 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-etc-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522226 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522244 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-run-netns\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522260 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-log-socket\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522296 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-node-log\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 
crc kubenswrapper[4672]: I1206 09:16:00.522318 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-ovn\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522362 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-env-overrides\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522388 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovnkube-script-lib\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522412 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-var-lib-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522499 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-cni-bin\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522523 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522585 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522629 4672 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-node-log\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522641 4672 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522652 4672 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-var-lib-openvswitch\") on node 
\"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522661 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522670 4672 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522679 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/713432b9-3b28-4ad0-b578-9d42aa1931aa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522690 4672 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-slash\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522700 4672 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713432b9-3b28-4ad0-b578-9d42aa1931aa-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522714 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blgnn\" (UniqueName: \"kubernetes.io/projected/713432b9-3b28-4ad0-b578-9d42aa1931aa-kube-api-access-blgnn\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522726 4672 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522738 4672 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522746 4672 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-log-socket\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522763 4672 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522771 4672 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522780 4672 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.522788 4672 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 
09:16:00.522796 4672 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713432b9-3b28-4ad0-b578-9d42aa1931aa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.596396 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/2.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.596952 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/1.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.596996 4672 generic.go:334] "Generic (PLEG): container finished" podID="25b493f7-0dae-4eb4-9499-0564410528f7" containerID="091aa187d1ee2bf8ad4eebac8370dc750f5636fb05c10d1368d28b50dd876465" exitCode=2 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.597065 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerDied","Data":"091aa187d1ee2bf8ad4eebac8370dc750f5636fb05c10d1368d28b50dd876465"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.597131 4672 scope.go:117] "RemoveContainer" containerID="e4d9a2a4e0be6b9ab12a348356a2cc8e8211a95855cab5a24ff9b3967b837140" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.597558 4672 scope.go:117] "RemoveContainer" containerID="091aa187d1ee2bf8ad4eebac8370dc750f5636fb05c10d1368d28b50dd876465" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.597744 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ks2jd_openshift-multus(25b493f7-0dae-4eb4-9499-0564410528f7)\"" pod="openshift-multus/multus-ks2jd" podUID="25b493f7-0dae-4eb4-9499-0564410528f7" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.599029 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovnkube-controller/3.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.604882 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovn-acl-logging/0.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605528 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xbbs5_713432b9-3b28-4ad0-b578-9d42aa1931aa/ovn-controller/0.log" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605796 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" exitCode=0 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605817 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" exitCode=0 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605825 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db" exitCode=0 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605832 4672 generic.go:334] "Generic (PLEG): 
container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b" exitCode=0 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605839 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1" exitCode=0 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605845 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec" exitCode=0 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605852 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c" exitCode=143 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.605859 4672 generic.go:334] "Generic (PLEG): container finished" podID="713432b9-3b28-4ad0-b578-9d42aa1931aa" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22" exitCode=143 Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606171 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606215 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606241 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606251 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606260 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606268 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606279 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606292 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606300 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606307 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606314 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606320 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606326 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606333 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606340 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606346 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606356 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606369 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606377 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606383 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606388 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606394 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606398 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606404 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606409 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606413 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606418 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606425 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606433 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606440 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606445 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606450 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606455 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606459 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606464 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606469 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606475 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606480 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606487 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" event={"ID":"713432b9-3b28-4ad0-b578-9d42aa1931aa","Type":"ContainerDied","Data":"5d6a89e307227cafbb58809edb9c2c25d1c8d42087540f3466ab30c439922c71"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606495 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606502 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606507 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606512 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606519 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606524 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606530 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606535 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606540 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606546 4672 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.606719 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbbs5" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623364 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovn-node-metrics-cert\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623442 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-kubelet\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623530 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-kubelet\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623647 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623659 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-systemd\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-systemd\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623725 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcc9j\" (UniqueName: \"kubernetes.io/projected/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-kube-api-access-fcc9j\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623781 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovnkube-config\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc 
kubenswrapper[4672]: I1206 09:16:00.623855 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-systemd-units\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623882 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-etc-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623931 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.623961 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-run-netns\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624087 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-etc-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624131 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-log-socket\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-run-netns\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624107 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-log-socket\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624186 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-systemd-units\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624213 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-node-log\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624252 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-env-overrides\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624271 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-ovn\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624285 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-node-log\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624292 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovnkube-script-lib\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624652 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-run-ovn\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624701 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-var-lib-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624744 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-cni-bin\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624766 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624821 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-slash\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624847 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-cni-netd\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.624988 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-var-lib-openvswitch\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.625231 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.625258 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-slash\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.625366 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-cni-netd\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.625394 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-host-cni-bin\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.625878 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-env-overrides\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.626004 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovnkube-config\") pod 
\"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.626260 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovnkube-script-lib\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.626841 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-ovn-node-metrics-cert\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.637445 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbbs5"] Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.647220 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbbs5"] Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.647688 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcc9j\" (UniqueName: \"kubernetes.io/projected/e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3-kube-api-access-fcc9j\") pod \"ovnkube-node-wjlb9\" (UID: \"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.669163 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.703622 4672 scope.go:117] "RemoveContainer" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.725637 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.745405 4672 scope.go:117] "RemoveContainer" containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.763653 4672 scope.go:117] "RemoveContainer" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.778537 4672 scope.go:117] "RemoveContainer" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.791222 4672 scope.go:117] "RemoveContainer" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.803546 4672 scope.go:117] "RemoveContainer" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.820141 4672 scope.go:117] "RemoveContainer" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.875661 4672 scope.go:117] "RemoveContainer" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.889168 4672 scope.go:117] "RemoveContainer" 
containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.902427 4672 scope.go:117] "RemoveContainer" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.902982 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": container with ID starting with c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29 not found: ID does not exist" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.903043 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} err="failed to get container status \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": rpc error: code = NotFound desc = could not find container \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": container with ID starting with c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.903084 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.903413 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": container with ID starting with 749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806 not found: ID does not exist" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.903454 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} err="failed to get container status \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": rpc error: code = NotFound desc = could not find container \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": container with ID starting with 749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.903482 4672 scope.go:117] "RemoveContainer" containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.903764 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": container with ID starting with 97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c not found: ID does not exist" containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.903788 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} err="failed to get container status \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": rpc error: code = 
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.903803 4672 scope.go:117] "RemoveContainer" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"
Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.904018 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": container with ID starting with 7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db not found: ID does not exist" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904046 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} err="failed to get container status \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": rpc error: code = NotFound desc = could not find container \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": container with ID starting with 7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904063 4672 scope.go:117] "RemoveContainer" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"
Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.904294 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": container with ID starting with 5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b not found: ID does not exist" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904313 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} err="failed to get container status \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": rpc error: code = NotFound desc = could not find container \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": container with ID starting with 5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904327 4672 scope.go:117] "RemoveContainer" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"
Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.904509 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": container with ID starting with 4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1 not found: ID does not exist" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904574 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} err="failed to get container status \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": rpc error: code = NotFound desc = could not find container \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": container with ID starting with 4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1 not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904616 4672 scope.go:117] "RemoveContainer" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"
Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.904802 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": container with ID starting with 68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec not found: ID does not exist" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904825 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} err="failed to get container status \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": rpc error: code = NotFound desc = could not find container \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": container with ID starting with 68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.904843 4672 scope.go:117] "RemoveContainer" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"
Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.905064 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": container with ID starting with 6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c not found: ID does not exist" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905093 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} err="failed to get container status \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": rpc error: code = NotFound desc = could not find container \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": container with ID starting with 6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905109 4672 scope.go:117] "RemoveContainer" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"
Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.905290 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": container with ID starting with 0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22 not found: ID does not exist" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"
containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905318 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} err="failed to get container status \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": rpc error: code = NotFound desc = could not find container \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": container with ID starting with 0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905336 4672 scope.go:117] "RemoveContainer" containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda" Dec 06 09:16:00 crc kubenswrapper[4672]: E1206 09:16:00.905583 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": container with ID starting with 10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda not found: ID does not exist" containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905663 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} err="failed to get container status \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": rpc error: code = NotFound desc = could not find container \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": container with ID starting with 10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905692 4672 scope.go:117] "RemoveContainer" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905937 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} err="failed to get container status \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": rpc error: code = NotFound desc = could not find container \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": container with ID starting with c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.905961 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.906149 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} err="failed to get container status \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": rpc error: code = NotFound desc = could not find container \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": container with ID starting with 749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.906172 4672 scope.go:117] "RemoveContainer" 
containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.906811 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} err="failed to get container status \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": rpc error: code = NotFound desc = could not find container \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": container with ID starting with 97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.906832 4672 scope.go:117] "RemoveContainer" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.907098 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} err="failed to get container status \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": rpc error: code = NotFound desc = could not find container \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": container with ID starting with 7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.907118 4672 scope.go:117] "RemoveContainer" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.907858 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} err="failed to get container status \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": rpc error: code = NotFound desc = could not find container \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": container with ID starting with 5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.907882 4672 scope.go:117] "RemoveContainer" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.908242 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} err="failed to get container status \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": rpc error: code = NotFound desc = could not find container \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": container with ID starting with 4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.908283 4672 scope.go:117] "RemoveContainer" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.908666 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} err="failed to get container status \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": rpc error: code = NotFound desc = could not find 
container \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": container with ID starting with 68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.908689 4672 scope.go:117] "RemoveContainer" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.909085 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} err="failed to get container status \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": rpc error: code = NotFound desc = could not find container \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": container with ID starting with 6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.909109 4672 scope.go:117] "RemoveContainer" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.909466 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} err="failed to get container status \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": rpc error: code = NotFound desc = could not find container \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": container with ID starting with 0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.909484 4672 scope.go:117] "RemoveContainer" containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.909905 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} err="failed to get container status \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": rpc error: code = NotFound desc = could not find container \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": container with ID starting with 10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.909947 4672 scope.go:117] "RemoveContainer" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.910277 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} err="failed to get container status \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": rpc error: code = NotFound desc = could not find container \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": container with ID starting with c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.910302 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.910739 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} err="failed to get container status \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": rpc error: code = NotFound desc = could not find container \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": container with ID starting with 749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806 not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.910757 4672 scope.go:117] "RemoveContainer" containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.911065 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} err="failed to get container status \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": rpc error: code = NotFound desc = could not find container \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": container with ID starting with 97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.911119 4672 scope.go:117] "RemoveContainer" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.911574 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} err="failed to get container status \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": rpc error: code = NotFound desc = could not find container \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": container with ID starting with 7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.911615 4672 scope.go:117] "RemoveContainer" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.911882 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} err="failed to get container status \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": rpc error: code = NotFound desc = could not find container \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": container with ID starting with 5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b not found: ID does not exist" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.911940 4672 scope.go:117] "RemoveContainer" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1" Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.912255 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} err="failed to get container status \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": rpc error: code = NotFound desc = could not find container \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": container with ID starting with 
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.912279 4672 scope.go:117] "RemoveContainer" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.912729 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} err="failed to get container status \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": rpc error: code = NotFound desc = could not find container \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": container with ID starting with 68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.912772 4672 scope.go:117] "RemoveContainer" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.913261 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} err="failed to get container status \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": rpc error: code = NotFound desc = could not find container \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": container with ID starting with 6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.913287 4672 scope.go:117] "RemoveContainer" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.913574 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} err="failed to get container status \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": rpc error: code = NotFound desc = could not find container \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": container with ID starting with 0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22 not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.913717 4672 scope.go:117] "RemoveContainer" containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.914048 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} err="failed to get container status \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": rpc error: code = NotFound desc = could not find container \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": container with ID starting with 10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.914075 4672 scope.go:117] "RemoveContainer" containerID="c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.914586 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29"} err="failed to get container status \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": rpc error: code = NotFound desc = could not find container \"c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29\": container with ID starting with c52d934ae7194f2316d35c504c2a4b72a03a3c504ad20ac5af66c13df45fda29 not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.914623 4672 scope.go:117] "RemoveContainer" containerID="749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.914894 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806"} err="failed to get container status \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": rpc error: code = NotFound desc = could not find container \"749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806\": container with ID starting with 749cf5d70a61796e0bc2258754ab0b077edbfd4f85d07f1aab5b7621a8ecc806 not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.914949 4672 scope.go:117] "RemoveContainer" containerID="97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.915938 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c"} err="failed to get container status \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": rpc error: code = NotFound desc = could not find container \"97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c\": container with ID starting with 97a2a48a67fdae3da551b894d1dec03b4348299dc3b3e19a8422dc8c7ae8277c not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.915963 4672 scope.go:117] "RemoveContainer" containerID="7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.916416 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db"} err="failed to get container status \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": rpc error: code = NotFound desc = could not find container \"7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db\": container with ID starting with 7eb40aee23e6564c5025d6bb5e595821a891d23444cae52c20a04a1caf1d51db not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.916434 4672 scope.go:117] "RemoveContainer" containerID="5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.916748 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b"} err="failed to get container status \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": rpc error: code = NotFound desc = could not find container \"5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b\": container with ID starting with 5d7e3d1087dc569d212e221380faaccd73b130682dec6229f73f50cfe4bce14b not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.916771 4672 scope.go:117] "RemoveContainer" containerID="4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.917213 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1"} err="failed to get container status \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": rpc error: code = NotFound desc = could not find container \"4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1\": container with ID starting with 4f4b62986cc9a42e167453663ebea6641bb4e6ddedabfbd13343b07823607da1 not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.917235 4672 scope.go:117] "RemoveContainer" containerID="68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.917778 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec"} err="failed to get container status \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": rpc error: code = NotFound desc = could not find container \"68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec\": container with ID starting with 68f283e775bd7bc790a2e24c8927ad87b0de0ca914c61b45f0c503c551d51aec not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.917818 4672 scope.go:117] "RemoveContainer" containerID="6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.918090 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c"} err="failed to get container status \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": rpc error: code = NotFound desc = could not find container \"6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c\": container with ID starting with 6d822e48b8a53a0045245dd9851cca5e53e044e2e8ed99e38dec22fdf2ee012c not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.918125 4672 scope.go:117] "RemoveContainer" containerID="0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.918456 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22"} err="failed to get container status \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": rpc error: code = NotFound desc = could not find container \"0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22\": container with ID starting with 0ad59af9ce65fe3f40dfafcb4afd5083e66b04c355b8d4a578d8521147188d22 not found: ID does not exist"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.918498 4672 scope.go:117] "RemoveContainer" containerID="10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"
Dec 06 09:16:00 crc kubenswrapper[4672]: I1206 09:16:00.919084 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda"} err="failed to get container status \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": rpc error: code = NotFound desc = could not find container \"10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda\": container with ID starting with 10dfddc9fb151cee3d114c4bc450f42c0841e1d0b45af0ba2082e61a74593eda not found: ID does not exist"
Dec 06 09:16:01 crc kubenswrapper[4672]: I1206 09:16:01.616697 4672 generic.go:334] "Generic (PLEG): container finished" podID="e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3" containerID="67d6fca72996c67fcb9828350184b685ff488a5809180b4bb1e5d0491063852f" exitCode=0
Dec 06 09:16:01 crc kubenswrapper[4672]: I1206 09:16:01.616823 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerDied","Data":"67d6fca72996c67fcb9828350184b685ff488a5809180b4bb1e5d0491063852f"}
Dec 06 09:16:01 crc kubenswrapper[4672]: I1206 09:16:01.617171 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"6e66b588dc02d60a4a25a642d313e8b4e4c59b852e2c860bd56c8357e767034e"}
Dec 06 09:16:01 crc kubenswrapper[4672]: I1206 09:16:01.619153 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/2.log"
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.572214 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713432b9-3b28-4ad0-b578-9d42aa1931aa" path="/var/lib/kubelet/pods/713432b9-3b28-4ad0-b578-9d42aa1931aa/volumes"
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.627876 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"7a46a35efe2665927a9d6a4252c861e0d6ea5447473e307609e4af0ae1d2eca5"}
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.627929 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"7126303c2f52ea69506feb9440318a0db4d4c66c9da4d0137e131c547531a793"}
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.627945 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"3d1c795defe9874c83003c056756667cb8e66257200e9d37560e61c6b344ae6c"}
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.627958 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"0b9deea99bbc61e52864520cccfdfb7d8502f9393b8589bc0ffb018293ad6ce0"}
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.627970 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"c30146372f358dcf1c87892a15fa5f998eb9586cc7bd2d9922ef14f45d9e9393"}
Dec 06 09:16:02 crc kubenswrapper[4672]: I1206 09:16:02.627980 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"65a22229491b28aa67d8d47fa2f4b669e73d6d3286ba12695e7f127ebf06bf85"}
event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"65a22229491b28aa67d8d47fa2f4b669e73d6d3286ba12695e7f127ebf06bf85"} Dec 06 09:16:04 crc kubenswrapper[4672]: I1206 09:16:04.645767 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"1c2a421f0687e21d86d062624bb43fffb2963d2e6f69bc471d2995014eb4079e"} Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.664828 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" event={"ID":"e068fa0d-ab1a-4ac3-9859-1e9dd9d457e3","Type":"ContainerStarted","Data":"34e56c02c30918955ff4d921888bc4066f40cabe4b0b2f8939b204a178697710"} Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.666067 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.666087 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.666098 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.690030 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.692021 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:07 crc kubenswrapper[4672]: I1206 09:16:07.698660 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" podStartSLOduration=7.698634118 podStartE2EDuration="7.698634118s" podCreationTimestamp="2025-12-06 09:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:16:07.693157768 +0000 UTC m=+585.437418095" watchObservedRunningTime="2025-12-06 09:16:07.698634118 +0000 UTC m=+585.442894425" Dec 06 09:16:12 crc kubenswrapper[4672]: I1206 09:16:12.320402 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:16:12 crc kubenswrapper[4672]: I1206 09:16:12.321248 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:16:12 crc kubenswrapper[4672]: I1206 09:16:12.559233 4672 scope.go:117] "RemoveContainer" containerID="091aa187d1ee2bf8ad4eebac8370dc750f5636fb05c10d1368d28b50dd876465" Dec 06 09:16:12 crc kubenswrapper[4672]: E1206 09:16:12.559480 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ks2jd_openshift-multus(25b493f7-0dae-4eb4-9499-0564410528f7)\"" 
pod="openshift-multus/multus-ks2jd" podUID="25b493f7-0dae-4eb4-9499-0564410528f7" Dec 06 09:16:23 crc kubenswrapper[4672]: I1206 09:16:23.557678 4672 scope.go:117] "RemoveContainer" containerID="091aa187d1ee2bf8ad4eebac8370dc750f5636fb05c10d1368d28b50dd876465" Dec 06 09:16:23 crc kubenswrapper[4672]: I1206 09:16:23.769182 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks2jd_25b493f7-0dae-4eb4-9499-0564410528f7/kube-multus/2.log" Dec 06 09:16:23 crc kubenswrapper[4672]: I1206 09:16:23.769538 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks2jd" event={"ID":"25b493f7-0dae-4eb4-9499-0564410528f7","Type":"ContainerStarted","Data":"9f74a6303c0d0b354397e9ecfe8710f4e1ac5ea064fde9c73b62d0f1f8a670e3"} Dec 06 09:16:30 crc kubenswrapper[4672]: I1206 09:16:30.700191 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjlb9" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.293873 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf"] Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.296260 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.303203 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.310510 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf"] Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.398694 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.398790 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.398854 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvvln\" (UniqueName: \"kubernetes.io/projected/08576097-cc2d-49d5-8bda-66efdd1f960a-kube-api-access-lvvln\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.499632 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" 
(UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.499682 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.499706 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvvln\" (UniqueName: \"kubernetes.io/projected/08576097-cc2d-49d5-8bda-66efdd1f960a-kube-api-access-lvvln\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.500406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.500484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.522363 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvvln\" (UniqueName: \"kubernetes.io/projected/08576097-cc2d-49d5-8bda-66efdd1f960a-kube-api-access-lvvln\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.617235 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.848820 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf"] Dec 06 09:16:41 crc kubenswrapper[4672]: I1206 09:16:41.881179 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" event={"ID":"08576097-cc2d-49d5-8bda-66efdd1f960a","Type":"ContainerStarted","Data":"b194b261846c323942e3593f22c2f5a23f827bb6c7115ee2608c6182172d8464"} Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.319847 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.320242 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.320302 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.321166 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b410864b2f905e632c9f0faa7e115cee3e4f8d1dd843cd26f566a60bf5790f9"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.321269 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://2b410864b2f905e632c9f0faa7e115cee3e4f8d1dd843cd26f566a60bf5790f9" gracePeriod=600 Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.888092 4672 generic.go:334] "Generic (PLEG): container finished" podID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerID="7eed93ab83493a72fe68fa962d5678aab105bc36086e6c6a442d836dd869e2cc" exitCode=0 Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.889159 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" event={"ID":"08576097-cc2d-49d5-8bda-66efdd1f960a","Type":"ContainerDied","Data":"7eed93ab83493a72fe68fa962d5678aab105bc36086e6c6a442d836dd869e2cc"} Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.893767 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="2b410864b2f905e632c9f0faa7e115cee3e4f8d1dd843cd26f566a60bf5790f9" exitCode=0 Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.893822 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"2b410864b2f905e632c9f0faa7e115cee3e4f8d1dd843cd26f566a60bf5790f9"} Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.893864 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"157a1103c9931308d56d2a9afffb01b9138166ad6f81a369e330a682cba427f9"} Dec 06 09:16:42 crc kubenswrapper[4672]: I1206 09:16:42.893888 4672 scope.go:117] "RemoveContainer" containerID="c07965cc625156f67df18ec68f14cf89ea9bd464984c84ab0aa0cd0dd54f62ac" Dec 06 09:16:44 crc kubenswrapper[4672]: I1206 09:16:44.909695 4672 generic.go:334] "Generic (PLEG): container finished" podID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerID="1d7906cfd95c656463c9a3dc21bddabb23f1c2515d5a3b3c257be7d63079c33c" exitCode=0 Dec 06 09:16:44 crc kubenswrapper[4672]: I1206 09:16:44.910131 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" event={"ID":"08576097-cc2d-49d5-8bda-66efdd1f960a","Type":"ContainerDied","Data":"1d7906cfd95c656463c9a3dc21bddabb23f1c2515d5a3b3c257be7d63079c33c"} Dec 06 09:16:45 crc kubenswrapper[4672]: I1206 09:16:45.919973 4672 generic.go:334] "Generic (PLEG): container finished" podID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerID="bef53a9e54a6e003461b9665e3a5499e327abf623010e8fa96d92ae1f9c2340a" exitCode=0 Dec 06 09:16:45 crc kubenswrapper[4672]: I1206 09:16:45.920020 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" event={"ID":"08576097-cc2d-49d5-8bda-66efdd1f960a","Type":"ContainerDied","Data":"bef53a9e54a6e003461b9665e3a5499e327abf623010e8fa96d92ae1f9c2340a"} Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.113817 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.234961 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-util\") pod \"08576097-cc2d-49d5-8bda-66efdd1f960a\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.235308 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-bundle\") pod \"08576097-cc2d-49d5-8bda-66efdd1f960a\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.235344 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvvln\" (UniqueName: \"kubernetes.io/projected/08576097-cc2d-49d5-8bda-66efdd1f960a-kube-api-access-lvvln\") pod \"08576097-cc2d-49d5-8bda-66efdd1f960a\" (UID: \"08576097-cc2d-49d5-8bda-66efdd1f960a\") " Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.236005 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-bundle" (OuterVolumeSpecName: "bundle") pod "08576097-cc2d-49d5-8bda-66efdd1f960a" (UID: "08576097-cc2d-49d5-8bda-66efdd1f960a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.240777 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08576097-cc2d-49d5-8bda-66efdd1f960a-kube-api-access-lvvln" (OuterVolumeSpecName: "kube-api-access-lvvln") pod "08576097-cc2d-49d5-8bda-66efdd1f960a" (UID: "08576097-cc2d-49d5-8bda-66efdd1f960a"). InnerVolumeSpecName "kube-api-access-lvvln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.249230 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-util" (OuterVolumeSpecName: "util") pod "08576097-cc2d-49d5-8bda-66efdd1f960a" (UID: "08576097-cc2d-49d5-8bda-66efdd1f960a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.336846 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.336887 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvvln\" (UniqueName: \"kubernetes.io/projected/08576097-cc2d-49d5-8bda-66efdd1f960a-kube-api-access-lvvln\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.336903 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08576097-cc2d-49d5-8bda-66efdd1f960a-util\") on node \"crc\" DevicePath \"\"" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.932276 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" event={"ID":"08576097-cc2d-49d5-8bda-66efdd1f960a","Type":"ContainerDied","Data":"b194b261846c323942e3593f22c2f5a23f827bb6c7115ee2608c6182172d8464"} Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.932315 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b194b261846c323942e3593f22c2f5a23f827bb6c7115ee2608c6182172d8464" Dec 06 09:16:47 crc kubenswrapper[4672]: I1206 09:16:47.932334 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.138424 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk"] Dec 06 09:16:50 crc kubenswrapper[4672]: E1206 09:16:50.139854 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="util" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.139951 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="util" Dec 06 09:16:50 crc kubenswrapper[4672]: E1206 09:16:50.140005 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="extract" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.140067 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="extract" Dec 06 09:16:50 crc kubenswrapper[4672]: E1206 09:16:50.140123 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="pull" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.140172 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="pull" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.140319 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="08576097-cc2d-49d5-8bda-66efdd1f960a" containerName="extract" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.140781 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.143528 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wwcxx" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.143781 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.144293 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.148214 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk"] Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.181225 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgr87\" (UniqueName: \"kubernetes.io/projected/63eadd21-65ec-4fbd-8c8c-265a1ade0b4c-kube-api-access-pgr87\") pod \"nmstate-operator-5b5b58f5c8-9bwkk\" (UID: \"63eadd21-65ec-4fbd-8c8c-265a1ade0b4c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.281921 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgr87\" (UniqueName: \"kubernetes.io/projected/63eadd21-65ec-4fbd-8c8c-265a1ade0b4c-kube-api-access-pgr87\") pod \"nmstate-operator-5b5b58f5c8-9bwkk\" (UID: \"63eadd21-65ec-4fbd-8c8c-265a1ade0b4c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.301489 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgr87\" 
(UniqueName: \"kubernetes.io/projected/63eadd21-65ec-4fbd-8c8c-265a1ade0b4c-kube-api-access-pgr87\") pod \"nmstate-operator-5b5b58f5c8-9bwkk\" (UID: \"63eadd21-65ec-4fbd-8c8c-265a1ade0b4c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.486558 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.899226 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk"] Dec 06 09:16:50 crc kubenswrapper[4672]: I1206 09:16:50.950528 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" event={"ID":"63eadd21-65ec-4fbd-8c8c-265a1ade0b4c","Type":"ContainerStarted","Data":"4b969f375f916c9ecad6b50561d166f2127ff1de6cd984af82000b3487256607"} Dec 06 09:16:53 crc kubenswrapper[4672]: I1206 09:16:53.968750 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" event={"ID":"63eadd21-65ec-4fbd-8c8c-265a1ade0b4c","Type":"ContainerStarted","Data":"172540a36d082b6683191ffb9371dfb257a66448c3eb47761c3189ff44022ce5"} Dec 06 09:16:53 crc kubenswrapper[4672]: I1206 09:16:53.991228 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-9bwkk" podStartSLOduration=1.62247778 podStartE2EDuration="3.99121122s" podCreationTimestamp="2025-12-06 09:16:50 +0000 UTC" firstStartedPulling="2025-12-06 09:16:50.909163916 +0000 UTC m=+628.653424203" lastFinishedPulling="2025-12-06 09:16:53.277897356 +0000 UTC m=+631.022157643" observedRunningTime="2025-12-06 09:16:53.988046023 +0000 UTC m=+631.732306310" watchObservedRunningTime="2025-12-06 09:16:53.99121122 +0000 UTC m=+631.735471507" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.025396 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.026710 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.028455 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-57qhh" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.045191 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.045949 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrjp\" (UniqueName: \"kubernetes.io/projected/23695df9-9be3-41a1-af24-8e35e5a875d2-kube-api-access-bjrjp\") pod \"nmstate-metrics-7f946cbc9-kv76p\" (UID: \"23695df9-9be3-41a1-af24-8e35e5a875d2\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.046091 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.048029 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.060855 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.074123 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7m49h"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.074786 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.101128 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146687 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-nmstate-lock\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146719 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-ovs-socket\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146751 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrjp\" (UniqueName: \"kubernetes.io/projected/23695df9-9be3-41a1-af24-8e35e5a875d2-kube-api-access-bjrjp\") pod \"nmstate-metrics-7f946cbc9-kv76p\" (UID: \"23695df9-9be3-41a1-af24-8e35e5a875d2\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146773 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08d88a07-50e0-4273-bbb4-9d6ed17820a8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146807 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-dbus-socket\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146827 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc76\" (UniqueName: \"kubernetes.io/projected/08d88a07-50e0-4273-bbb4-9d6ed17820a8-kube-api-access-mpc76\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.146844 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzj7q\" (UniqueName: \"kubernetes.io/projected/a127c4da-7435-45e4-b772-f8e53381bea2-kube-api-access-fzj7q\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.191733 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrjp\" (UniqueName: \"kubernetes.io/projected/23695df9-9be3-41a1-af24-8e35e5a875d2-kube-api-access-bjrjp\") pod \"nmstate-metrics-7f946cbc9-kv76p\" (UID: \"23695df9-9be3-41a1-af24-8e35e5a875d2\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.243687 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.245116 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.260298 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-dbus-socket\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.260398 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc76\" (UniqueName: \"kubernetes.io/projected/08d88a07-50e0-4273-bbb4-9d6ed17820a8-kube-api-access-mpc76\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.260433 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzj7q\" (UniqueName: \"kubernetes.io/projected/a127c4da-7435-45e4-b772-f8e53381bea2-kube-api-access-fzj7q\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.260584 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-nmstate-lock\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.260670 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-ovs-socket\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.260770 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08d88a07-50e0-4273-bbb4-9d6ed17820a8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: E1206 09:16:55.260967 4672 secret.go:188] Couldn't get 
secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 06 09:16:55 crc kubenswrapper[4672]: E1206 09:16:55.261027 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08d88a07-50e0-4273-bbb4-9d6ed17820a8-tls-key-pair podName:08d88a07-50e0-4273-bbb4-9d6ed17820a8 nodeName:}" failed. No retries permitted until 2025-12-06 09:16:55.76100736 +0000 UTC m=+633.505267647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/08d88a07-50e0-4273-bbb4-9d6ed17820a8-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-mp77w" (UID: "08d88a07-50e0-4273-bbb4-9d6ed17820a8") : secret "openshift-nmstate-webhook" not found Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.262787 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-ovs-socket\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.263059 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-nmstate-lock\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.263198 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a127c4da-7435-45e4-b772-f8e53381bea2-dbus-socket\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.267694 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.268419 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.268675 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-w2c24" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.280493 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.303833 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzj7q\" (UniqueName: \"kubernetes.io/projected/a127c4da-7435-45e4-b772-f8e53381bea2-kube-api-access-fzj7q\") pod \"nmstate-handler-7m49h\" (UID: \"a127c4da-7435-45e4-b772-f8e53381bea2\") " pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.320480 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc76\" (UniqueName: \"kubernetes.io/projected/08d88a07-50e0-4273-bbb4-9d6ed17820a8-kube-api-access-mpc76\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.342167 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.368762 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8681df0a-44cf-471f-9257-bda9bae18f87-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.369476 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsm4\" (UniqueName: \"kubernetes.io/projected/8681df0a-44cf-471f-9257-bda9bae18f87-kube-api-access-nqsm4\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.369720 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8681df0a-44cf-471f-9257-bda9bae18f87-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.389897 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:55 crc kubenswrapper[4672]: W1206 09:16:55.417523 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda127c4da_7435_45e4_b772_f8e53381bea2.slice/crio-0c290989ed7fbac5f5410a71562b82c9650a55c8e8f99f4365922133cb47d14b WatchSource:0}: Error finding container 0c290989ed7fbac5f5410a71562b82c9650a55c8e8f99f4365922133cb47d14b: Status 404 returned error can't find the container with id 0c290989ed7fbac5f5410a71562b82c9650a55c8e8f99f4365922133cb47d14b Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.470334 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8681df0a-44cf-471f-9257-bda9bae18f87-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.470771 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsm4\" (UniqueName: \"kubernetes.io/projected/8681df0a-44cf-471f-9257-bda9bae18f87-kube-api-access-nqsm4\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.470931 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8681df0a-44cf-471f-9257-bda9bae18f87-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: E1206 09:16:55.471170 4672 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret 
"plugin-serving-cert" not found Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.471380 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8681df0a-44cf-471f-9257-bda9bae18f87-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: E1206 09:16:55.471394 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8681df0a-44cf-471f-9257-bda9bae18f87-plugin-serving-cert podName:8681df0a-44cf-471f-9257-bda9bae18f87 nodeName:}" failed. No retries permitted until 2025-12-06 09:16:55.971367356 +0000 UTC m=+633.715627633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8681df0a-44cf-471f-9257-bda9bae18f87-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-zzxp6" (UID: "8681df0a-44cf-471f-9257-bda9bae18f87") : secret "plugin-serving-cert" not found Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.489025 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-58cfb46cd4-qdbnq"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.490543 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.506181 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58cfb46cd4-qdbnq"] Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.515988 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsm4\" (UniqueName: \"kubernetes.io/projected/8681df0a-44cf-471f-9257-bda9bae18f87-kube-api-access-nqsm4\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576208 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-service-ca\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576536 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-trusted-ca-bundle\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576581 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44mjj\" (UniqueName: \"kubernetes.io/projected/5958ea03-8c53-4c75-bab3-058d77dc8c53-kube-api-access-44mjj\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576663 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-oauth-serving-cert\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576724 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-serving-cert\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576776 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-oauth-config\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.576811 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-config\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678075 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-serving-cert\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678151 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-oauth-config\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678176 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-config\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678213 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-service-ca\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678233 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-trusted-ca-bundle\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678255 4672 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-44mjj\" (UniqueName: \"kubernetes.io/projected/5958ea03-8c53-4c75-bab3-058d77dc8c53-kube-api-access-44mjj\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.678284 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-oauth-serving-cert\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.679205 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-oauth-serving-cert\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.680833 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-service-ca\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.681265 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-config\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.681284 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5958ea03-8c53-4c75-bab3-058d77dc8c53-trusted-ca-bundle\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.685993 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-oauth-config\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.686592 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5958ea03-8c53-4c75-bab3-058d77dc8c53-console-serving-cert\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.701637 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44mjj\" (UniqueName: \"kubernetes.io/projected/5958ea03-8c53-4c75-bab3-058d77dc8c53-kube-api-access-44mjj\") pod \"console-58cfb46cd4-qdbnq\" (UID: \"5958ea03-8c53-4c75-bab3-058d77dc8c53\") " pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.779302 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/08d88a07-50e0-4273-bbb4-9d6ed17820a8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.782592 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08d88a07-50e0-4273-bbb4-9d6ed17820a8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-mp77w\" (UID: \"08d88a07-50e0-4273-bbb4-9d6ed17820a8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.802191 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p"] Dec 06 09:16:55 crc kubenswrapper[4672]: W1206 09:16:55.808433 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23695df9_9be3_41a1_af24_8e35e5a875d2.slice/crio-3b5612aaa8a4cb20659c3e32fcb7c10975465e38399f6aabe2d080a18b14bd81 WatchSource:0}: Error finding container 3b5612aaa8a4cb20659c3e32fcb7c10975465e38399f6aabe2d080a18b14bd81: Status 404 returned error can't find the container with id 3b5612aaa8a4cb20659c3e32fcb7c10975465e38399f6aabe2d080a18b14bd81 Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.809014 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.958072 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.982185 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8681df0a-44cf-471f-9257-bda9bae18f87-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.984219 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" event={"ID":"23695df9-9be3-41a1-af24-8e35e5a875d2","Type":"ContainerStarted","Data":"3b5612aaa8a4cb20659c3e32fcb7c10975465e38399f6aabe2d080a18b14bd81"} Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.985153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7m49h" event={"ID":"a127c4da-7435-45e4-b772-f8e53381bea2","Type":"ContainerStarted","Data":"0c290989ed7fbac5f5410a71562b82c9650a55c8e8f99f4365922133cb47d14b"} Dec 06 09:16:55 crc kubenswrapper[4672]: I1206 09:16:55.987138 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8681df0a-44cf-471f-9257-bda9bae18f87-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzxp6\" (UID: \"8681df0a-44cf-471f-9257-bda9bae18f87\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.031780 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58cfb46cd4-qdbnq"] Dec 06 09:16:56 crc kubenswrapper[4672]: W1206 09:16:56.037483 4672 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5958ea03_8c53_4c75_bab3_058d77dc8c53.slice/crio-d16d4d99bdf063a972700d80eb529ec51cb49a14b6d872c369e07519617bdbcc WatchSource:0}: Error finding container d16d4d99bdf063a972700d80eb529ec51cb49a14b6d872c369e07519617bdbcc: Status 404 returned error can't find the container with id d16d4d99bdf063a972700d80eb529ec51cb49a14b6d872c369e07519617bdbcc Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.193094 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.405863 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w"] Dec 06 09:16:56 crc kubenswrapper[4672]: W1206 09:16:56.411139 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d88a07_50e0_4273_bbb4_9d6ed17820a8.slice/crio-6998f4353495b6e91d34d24a2756190f5e1b04dcf965d116236b878ac25438fd WatchSource:0}: Error finding container 6998f4353495b6e91d34d24a2756190f5e1b04dcf965d116236b878ac25438fd: Status 404 returned error can't find the container with id 6998f4353495b6e91d34d24a2756190f5e1b04dcf965d116236b878ac25438fd Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.456135 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6"] Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.993312 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58cfb46cd4-qdbnq" event={"ID":"5958ea03-8c53-4c75-bab3-058d77dc8c53","Type":"ContainerStarted","Data":"e4c11289b1fae5de2531a618f30eeb35d4ac1194c70af848b6d4f9b7d2157823"} Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.993372 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58cfb46cd4-qdbnq" event={"ID":"5958ea03-8c53-4c75-bab3-058d77dc8c53","Type":"ContainerStarted","Data":"d16d4d99bdf063a972700d80eb529ec51cb49a14b6d872c369e07519617bdbcc"} Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.994850 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" event={"ID":"8681df0a-44cf-471f-9257-bda9bae18f87","Type":"ContainerStarted","Data":"87f0bccaa85bb00f0c59e4a08e679e987d5bdcc0e109090009563be6fcbe59b3"} Dec 06 09:16:56 crc kubenswrapper[4672]: I1206 09:16:56.996111 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" event={"ID":"08d88a07-50e0-4273-bbb4-9d6ed17820a8","Type":"ContainerStarted","Data":"6998f4353495b6e91d34d24a2756190f5e1b04dcf965d116236b878ac25438fd"} Dec 06 09:16:57 crc kubenswrapper[4672]: I1206 09:16:57.010986 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58cfb46cd4-qdbnq" podStartSLOduration=2.010972274 podStartE2EDuration="2.010972274s" podCreationTimestamp="2025-12-06 09:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:16:57.010670006 +0000 UTC m=+634.754930303" watchObservedRunningTime="2025-12-06 09:16:57.010972274 +0000 UTC m=+634.755232561" Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.013444 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7m49h" 
event={"ID":"a127c4da-7435-45e4-b772-f8e53381bea2","Type":"ContainerStarted","Data":"c748035eed0df09ba84adbf50893192bf66ae67f195b21553fc74068df661eea"} Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.014263 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.018923 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" event={"ID":"08d88a07-50e0-4273-bbb4-9d6ed17820a8","Type":"ContainerStarted","Data":"156cb7bc40b6757fd2c93cdfa29dba6361f14fc87bc854f80bb6ca6db7286be3"} Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.019041 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.021276 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" event={"ID":"23695df9-9be3-41a1-af24-8e35e5a875d2","Type":"ContainerStarted","Data":"8d48e9cc77aeb56688fe895605f7269347f0d5e9eb090463c7fb61d0a8bf4f57"} Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.033382 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7m49h" podStartSLOduration=1.184728212 podStartE2EDuration="4.03336776s" podCreationTimestamp="2025-12-06 09:16:55 +0000 UTC" firstStartedPulling="2025-12-06 09:16:55.421015847 +0000 UTC m=+633.165276134" lastFinishedPulling="2025-12-06 09:16:58.269655395 +0000 UTC m=+636.013915682" observedRunningTime="2025-12-06 09:16:59.029100321 +0000 UTC m=+636.773360608" watchObservedRunningTime="2025-12-06 09:16:59.03336776 +0000 UTC m=+636.777628047" Dec 06 09:16:59 crc kubenswrapper[4672]: I1206 09:16:59.058538 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" podStartSLOduration=2.229055665 podStartE2EDuration="4.058518059s" podCreationTimestamp="2025-12-06 09:16:55 +0000 UTC" firstStartedPulling="2025-12-06 09:16:56.412617195 +0000 UTC m=+634.156877482" lastFinishedPulling="2025-12-06 09:16:58.242079589 +0000 UTC m=+635.986339876" observedRunningTime="2025-12-06 09:16:59.044116499 +0000 UTC m=+636.788376786" watchObservedRunningTime="2025-12-06 09:16:59.058518059 +0000 UTC m=+636.802778346" Dec 06 09:17:00 crc kubenswrapper[4672]: I1206 09:17:00.034496 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" event={"ID":"8681df0a-44cf-471f-9257-bda9bae18f87","Type":"ContainerStarted","Data":"b3a6e0a3fcf101ff86c97d5490bf0e319ecc93327a5fe307654396148d0e48b8"} Dec 06 09:17:00 crc kubenswrapper[4672]: I1206 09:17:00.048866 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzxp6" podStartSLOduration=2.039732333 podStartE2EDuration="5.048853451s" podCreationTimestamp="2025-12-06 09:16:55 +0000 UTC" firstStartedPulling="2025-12-06 09:16:56.478512426 +0000 UTC m=+634.222772713" lastFinishedPulling="2025-12-06 09:16:59.487633544 +0000 UTC m=+637.231893831" observedRunningTime="2025-12-06 09:17:00.048656806 +0000 UTC m=+637.792917103" watchObservedRunningTime="2025-12-06 09:17:00.048853451 +0000 UTC m=+637.793113738" Dec 06 09:17:01 crc kubenswrapper[4672]: I1206 09:17:01.041168 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" event={"ID":"23695df9-9be3-41a1-af24-8e35e5a875d2","Type":"ContainerStarted","Data":"163954bb98164994a317806b0a4e0d606f1af100ff371da3df5127092ce4a944"} Dec 06 09:17:01 crc kubenswrapper[4672]: I1206 09:17:01.058437 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-kv76p" podStartSLOduration=1.117979767 podStartE2EDuration="6.058420429s" podCreationTimestamp="2025-12-06 09:16:55 +0000 UTC" firstStartedPulling="2025-12-06 09:16:55.810875832 +0000 UTC m=+633.555136119" lastFinishedPulling="2025-12-06 09:17:00.751316494 +0000 UTC m=+638.495576781" observedRunningTime="2025-12-06 09:17:01.057002799 +0000 UTC m=+638.801263076" watchObservedRunningTime="2025-12-06 09:17:01.058420429 +0000 UTC m=+638.802680716" Dec 06 09:17:05 crc kubenswrapper[4672]: I1206 09:17:05.430020 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7m49h" Dec 06 09:17:05 crc kubenswrapper[4672]: I1206 09:17:05.810016 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:17:05 crc kubenswrapper[4672]: I1206 09:17:05.810091 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:17:05 crc kubenswrapper[4672]: I1206 09:17:05.815163 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:17:06 crc kubenswrapper[4672]: I1206 09:17:06.072689 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58cfb46cd4-qdbnq" Dec 06 09:17:06 crc kubenswrapper[4672]: I1206 09:17:06.171370 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dcdqg"] Dec 06 09:17:15 crc kubenswrapper[4672]: I1206 09:17:15.966853 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-mp77w" Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.811975 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj"] Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.813583 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.825856 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.833242 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj"] Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.941695 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxch\" (UniqueName: \"kubernetes.io/projected/d6fce567-e6b2-4968-afff-b87e8c3d5058-kube-api-access-sbxch\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.941857 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:27 crc kubenswrapper[4672]: I1206 09:17:27.941949 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.043940 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxch\" (UniqueName: \"kubernetes.io/projected/d6fce567-e6b2-4968-afff-b87e8c3d5058-kube-api-access-sbxch\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.044054 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.044114 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.044816 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.044938 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.068718 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxch\" (UniqueName: \"kubernetes.io/projected/d6fce567-e6b2-4968-afff-b87e8c3d5058-kube-api-access-sbxch\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.132624 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:28 crc kubenswrapper[4672]: I1206 09:17:28.550563 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj"] Dec 06 09:17:29 crc kubenswrapper[4672]: I1206 09:17:29.240119 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerID="2ba46d5bba84fcc40be45784a1b3ae6e509fb9e6de4636bce1014ec01ae70827" exitCode=0 Dec 06 09:17:29 crc kubenswrapper[4672]: I1206 09:17:29.240231 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" event={"ID":"d6fce567-e6b2-4968-afff-b87e8c3d5058","Type":"ContainerDied","Data":"2ba46d5bba84fcc40be45784a1b3ae6e509fb9e6de4636bce1014ec01ae70827"} Dec 06 09:17:29 crc kubenswrapper[4672]: I1206 09:17:29.240545 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" event={"ID":"d6fce567-e6b2-4968-afff-b87e8c3d5058","Type":"ContainerStarted","Data":"c065b670d3afdf8f5f50e0b1828e024ac0c71c3073efcbf719b695d5f62bd753"} Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.216042 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dcdqg" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerName="console" containerID="cri-o://9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e" gracePeriod=15 Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.258349 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerID="f74a7615f06b682aed7b4d8869d2a55e02d86c2de7cd6e312c928c44fa04171d" exitCode=0 Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.258682 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" 
event={"ID":"d6fce567-e6b2-4968-afff-b87e8c3d5058","Type":"ContainerDied","Data":"f74a7615f06b682aed7b4d8869d2a55e02d86c2de7cd6e312c928c44fa04171d"} Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.570630 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dcdqg_de34b8a9-076f-4aa5-acb7-52361b6deeb8/console/0.log" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.570698 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699334 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-service-ca\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699517 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-config\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699619 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnw2g\" (UniqueName: \"kubernetes.io/projected/de34b8a9-076f-4aa5-acb7-52361b6deeb8-kube-api-access-cnw2g\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699665 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-oauth-config\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699703 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-serving-cert\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699734 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-trusted-ca-bundle\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.699774 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-oauth-serving-cert\") pod \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\" (UID: \"de34b8a9-076f-4aa5-acb7-52361b6deeb8\") " Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.700284 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-service-ca" (OuterVolumeSpecName: "service-ca") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.700501 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-config" (OuterVolumeSpecName: "console-config") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.701216 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.701416 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.706746 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.706886 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.713073 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de34b8a9-076f-4aa5-acb7-52361b6deeb8-kube-api-access-cnw2g" (OuterVolumeSpecName: "kube-api-access-cnw2g") pod "de34b8a9-076f-4aa5-acb7-52361b6deeb8" (UID: "de34b8a9-076f-4aa5-acb7-52361b6deeb8"). InnerVolumeSpecName "kube-api-access-cnw2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.801956 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnw2g\" (UniqueName: \"kubernetes.io/projected/de34b8a9-076f-4aa5-acb7-52361b6deeb8-kube-api-access-cnw2g\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.801990 4672 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.801998 4672 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.802006 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.802014 4672 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.802022 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:31 crc kubenswrapper[4672]: I1206 09:17:31.802031 4672 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de34b8a9-076f-4aa5-acb7-52361b6deeb8-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.268753 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dcdqg_de34b8a9-076f-4aa5-acb7-52361b6deeb8/console/0.log" Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.268828 4672 generic.go:334] "Generic (PLEG): container finished" podID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerID="9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e" exitCode=2 Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.268932 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dcdqg" Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.268946 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dcdqg" event={"ID":"de34b8a9-076f-4aa5-acb7-52361b6deeb8","Type":"ContainerDied","Data":"9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e"} Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.268983 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dcdqg" event={"ID":"de34b8a9-076f-4aa5-acb7-52361b6deeb8","Type":"ContainerDied","Data":"b7240216e085894892f947c16b9d6387fb36461f29a2d91654a708ccb061a26b"} Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.269006 4672 scope.go:117] "RemoveContainer" containerID="9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e" Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.273753 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerID="5f4091ee6282a9196fcd4238517d29538f067e76333ed6990d897ec7d2160f5d" exitCode=0 Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.273816 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" event={"ID":"d6fce567-e6b2-4968-afff-b87e8c3d5058","Type":"ContainerDied","Data":"5f4091ee6282a9196fcd4238517d29538f067e76333ed6990d897ec7d2160f5d"} Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.295737 4672 scope.go:117] "RemoveContainer" containerID="9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e" Dec 06 09:17:32 crc kubenswrapper[4672]: E1206 09:17:32.296222 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e\": container with ID starting with 9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e not found: ID does not exist" containerID="9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e" Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.296275 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e"} err="failed to get container status \"9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e\": rpc error: code = NotFound desc = could not find container \"9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e\": container with ID starting with 9fae834b2b00e17e329b219768b7f8d2e6e7eec03531b7b23664c3e9f4778c3e not found: ID does not exist" Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.333371 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dcdqg"] Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.338102 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dcdqg"] Dec 06 09:17:32 crc kubenswrapper[4672]: I1206 09:17:32.566288 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" path="/var/lib/kubelet/pods/de34b8a9-076f-4aa5-acb7-52361b6deeb8/volumes" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.540396 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.625689 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-util\") pod \"d6fce567-e6b2-4968-afff-b87e8c3d5058\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.625745 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-bundle\") pod \"d6fce567-e6b2-4968-afff-b87e8c3d5058\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.625843 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbxch\" (UniqueName: \"kubernetes.io/projected/d6fce567-e6b2-4968-afff-b87e8c3d5058-kube-api-access-sbxch\") pod \"d6fce567-e6b2-4968-afff-b87e8c3d5058\" (UID: \"d6fce567-e6b2-4968-afff-b87e8c3d5058\") " Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.627627 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-bundle" (OuterVolumeSpecName: "bundle") pod "d6fce567-e6b2-4968-afff-b87e8c3d5058" (UID: "d6fce567-e6b2-4968-afff-b87e8c3d5058"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.630326 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fce567-e6b2-4968-afff-b87e8c3d5058-kube-api-access-sbxch" (OuterVolumeSpecName: "kube-api-access-sbxch") pod "d6fce567-e6b2-4968-afff-b87e8c3d5058" (UID: "d6fce567-e6b2-4968-afff-b87e8c3d5058"). InnerVolumeSpecName "kube-api-access-sbxch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.639758 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-util" (OuterVolumeSpecName: "util") pod "d6fce567-e6b2-4968-afff-b87e8c3d5058" (UID: "d6fce567-e6b2-4968-afff-b87e8c3d5058"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.726955 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbxch\" (UniqueName: \"kubernetes.io/projected/d6fce567-e6b2-4968-afff-b87e8c3d5058-kube-api-access-sbxch\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.726998 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-util\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:33 crc kubenswrapper[4672]: I1206 09:17:33.727010 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6fce567-e6b2-4968-afff-b87e8c3d5058-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:34 crc kubenswrapper[4672]: I1206 09:17:34.302785 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" event={"ID":"d6fce567-e6b2-4968-afff-b87e8c3d5058","Type":"ContainerDied","Data":"c065b670d3afdf8f5f50e0b1828e024ac0c71c3073efcbf719b695d5f62bd753"} Dec 06 09:17:34 crc kubenswrapper[4672]: I1206 09:17:34.302833 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c065b670d3afdf8f5f50e0b1828e024ac0c71c3073efcbf719b695d5f62bd753" Dec 06 09:17:34 crc kubenswrapper[4672]: I1206 09:17:34.302874 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj" Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.635477 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"] Dec 06 09:17:43 crc kubenswrapper[4672]: E1206 09:17:43.636309 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="pull" Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.636325 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="pull" Dec 06 09:17:43 crc kubenswrapper[4672]: E1206 09:17:43.636338 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="extract" Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.636345 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="extract" Dec 06 09:17:43 crc kubenswrapper[4672]: E1206 09:17:43.636357 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerName="console" Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.636364 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerName="console" Dec 06 09:17:43 crc kubenswrapper[4672]: E1206 09:17:43.636377 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="util" Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.636384 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="util" Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.636490 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fce567-e6b2-4968-afff-b87e8c3d5058" containerName="extract" Dec 
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.636511 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="de34b8a9-076f-4aa5-acb7-52361b6deeb8" containerName="console"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.637110 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.638789 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.644094 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.644567 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.653998 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5kr5w"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.654483 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.664044 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"]
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.763663 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/216580e9-9198-4b66-bf50-46df3a04c88e-apiservice-cert\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.763780 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/216580e9-9198-4b66-bf50-46df3a04c88e-webhook-cert\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.763807 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfsg\" (UniqueName: \"kubernetes.io/projected/216580e9-9198-4b66-bf50-46df3a04c88e-kube-api-access-pdfsg\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.864689 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/216580e9-9198-4b66-bf50-46df3a04c88e-apiservice-cert\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.864777 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/216580e9-9198-4b66-bf50-46df3a04c88e-webhook-cert\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.864800 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdfsg\" (UniqueName: \"kubernetes.io/projected/216580e9-9198-4b66-bf50-46df3a04c88e-kube-api-access-pdfsg\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.872493 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/216580e9-9198-4b66-bf50-46df3a04c88e-webhook-cert\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.888489 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/216580e9-9198-4b66-bf50-46df3a04c88e-apiservice-cert\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.907410 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"]
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.908152 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.911356 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.911377 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.911356 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kmw7s"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.918388 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfsg\" (UniqueName: \"kubernetes.io/projected/216580e9-9198-4b66-bf50-46df3a04c88e-kube-api-access-pdfsg\") pod \"metallb-operator-controller-manager-765868b4fd-qt2wp\" (UID: \"216580e9-9198-4b66-bf50-46df3a04c88e\") " pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.942536 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"]
Dec 06 09:17:43 crc kubenswrapper[4672]: I1206 09:17:43.956916 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.067492 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8faaf896-2bc9-489b-97dc-29e0efa86a91-apiservice-cert\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.067906 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p89ss\" (UniqueName: \"kubernetes.io/projected/8faaf896-2bc9-489b-97dc-29e0efa86a91-kube-api-access-p89ss\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.068017 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8faaf896-2bc9-489b-97dc-29e0efa86a91-webhook-cert\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.172098 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8faaf896-2bc9-489b-97dc-29e0efa86a91-apiservice-cert\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.172763 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p89ss\" (UniqueName: \"kubernetes.io/projected/8faaf896-2bc9-489b-97dc-29e0efa86a91-kube-api-access-p89ss\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.172810 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8faaf896-2bc9-489b-97dc-29e0efa86a91-webhook-cert\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.183074 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8faaf896-2bc9-489b-97dc-29e0efa86a91-apiservice-cert\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"
pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.195200 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p89ss\" (UniqueName: \"kubernetes.io/projected/8faaf896-2bc9-489b-97dc-29e0efa86a91-kube-api-access-p89ss\") pod \"metallb-operator-webhook-server-6df5976447-kzfnr\" (UID: \"8faaf896-2bc9-489b-97dc-29e0efa86a91\") " pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.248862 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.328995 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp"] Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.373031 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp" event={"ID":"216580e9-9198-4b66-bf50-46df3a04c88e","Type":"ContainerStarted","Data":"c8e2bb398be35627affee56ba5b84450f89fbfd01cb3cafa4baa4c10485dae05"} Dec 06 09:17:44 crc kubenswrapper[4672]: I1206 09:17:44.605526 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr"] Dec 06 09:17:45 crc kubenswrapper[4672]: I1206 09:17:45.380055 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" event={"ID":"8faaf896-2bc9-489b-97dc-29e0efa86a91","Type":"ContainerStarted","Data":"3dcec929d9b17cc6e0f8f1c9103a9ff632307773295ea406a726872216d59599"} Dec 06 09:17:51 crc kubenswrapper[4672]: I1206 09:17:51.417020 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" event={"ID":"8faaf896-2bc9-489b-97dc-29e0efa86a91","Type":"ContainerStarted","Data":"2aed176f64baa5c76ec653a95bd4011449f66c84e3369dc0d66d36e1e554d895"} Dec 06 09:17:51 crc kubenswrapper[4672]: I1206 09:17:51.417780 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" Dec 06 09:17:51 crc kubenswrapper[4672]: I1206 09:17:51.419767 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp" event={"ID":"216580e9-9198-4b66-bf50-46df3a04c88e","Type":"ContainerStarted","Data":"0fe7c184cbdb93d4b13ea51d4aac44dcb97f97233093cb6115a896d9195cc88b"} Dec 06 09:17:51 crc kubenswrapper[4672]: I1206 09:17:51.419999 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp" Dec 06 09:17:51 crc kubenswrapper[4672]: I1206 09:17:51.437786 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" podStartSLOduration=1.9538905899999999 podStartE2EDuration="8.4377649s" podCreationTimestamp="2025-12-06 09:17:43 +0000 UTC" firstStartedPulling="2025-12-06 09:17:44.613845182 +0000 UTC m=+682.358105469" lastFinishedPulling="2025-12-06 09:17:51.097719492 +0000 UTC m=+688.841979779" observedRunningTime="2025-12-06 09:17:51.435429655 +0000 UTC m=+689.179689932" watchObservedRunningTime="2025-12-06 09:17:51.4377649 +0000 UTC m=+689.182025187" Dec 06 09:17:51 crc 
kubenswrapper[4672]: I1206 09:17:51.460184 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp" podStartSLOduration=1.796275498 podStartE2EDuration="8.460168588s" podCreationTimestamp="2025-12-06 09:17:43 +0000 UTC" firstStartedPulling="2025-12-06 09:17:44.34354136 +0000 UTC m=+682.087801647" lastFinishedPulling="2025-12-06 09:17:51.00743445 +0000 UTC m=+688.751694737" observedRunningTime="2025-12-06 09:17:51.457467694 +0000 UTC m=+689.201727981" watchObservedRunningTime="2025-12-06 09:17:51.460168588 +0000 UTC m=+689.204428875" Dec 06 09:18:04 crc kubenswrapper[4672]: I1206 09:18:04.265240 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6df5976447-kzfnr" Dec 06 09:18:23 crc kubenswrapper[4672]: I1206 09:18:23.960134 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-765868b4fd-qt2wp" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.824343 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wjtmh"] Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.826700 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.829213 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f"] Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.830103 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.834271 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.839635 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.842302 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.843226 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f"] Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.846853 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lxnw7" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.945092 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-skjzl"] Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.945930 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-skjzl" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.958147 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.958207 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.958443 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9hv5l" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.958823 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962572 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4z9\" (UniqueName: \"kubernetes.io/projected/faa92f29-2bae-4481-ab38-1a0b681d73d9-kube-api-access-bp4z9\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962660 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics-certs\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-conf\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962707 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-sockets\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962732 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962754 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-reloader\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962775 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnjtz\" (UniqueName: \"kubernetes.io/projected/2a795467-0c6f-4dae-bd0e-0595c9eb88b4-kube-api-access-gnjtz\") pod \"frr-k8s-webhook-server-7fcb986d4-mqk7f\" (UID: \"2a795467-0c6f-4dae-bd0e-0595c9eb88b4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962789 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-startup\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.962817 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a795467-0c6f-4dae-bd0e-0595c9eb88b4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mqk7f\" (UID: \"2a795467-0c6f-4dae-bd0e-0595c9eb88b4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.966856 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-ljcvb"] Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.967796 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.969194 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 06 09:18:24 crc kubenswrapper[4672]: I1206 09:18:24.986627 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ljcvb"] Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064319 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064647 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-metrics-certs\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064735 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5gmn\" (UniqueName: \"kubernetes.io/projected/47d9472b-be65-46ea-8eff-fa70e315ed49-kube-api-access-h5gmn\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064810 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-reloader\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064894 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-metrics-certs\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-cert\") pod \"controller-f8648f98b-ljcvb\" 
(UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065090 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnjtz\" (UniqueName: \"kubernetes.io/projected/2a795467-0c6f-4dae-bd0e-0595c9eb88b4-kube-api-access-gnjtz\") pod \"frr-k8s-webhook-server-7fcb986d4-mqk7f\" (UID: \"2a795467-0c6f-4dae-bd0e-0595c9eb88b4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065192 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-startup\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065293 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hv5\" (UniqueName: \"kubernetes.io/projected/35023ac9-ea1e-4576-b700-4afe57f59230-kube-api-access-22hv5\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065395 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a795467-0c6f-4dae-bd0e-0595c9eb88b4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mqk7f\" (UID: \"2a795467-0c6f-4dae-bd0e-0595c9eb88b4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065492 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065206 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-reloader\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.064900 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065739 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4z9\" (UniqueName: \"kubernetes.io/projected/faa92f29-2bae-4481-ab38-1a0b681d73d9-kube-api-access-bp4z9\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065794 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics-certs\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 
09:18:25.065870 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-conf\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.065886 4672 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065895 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-sockets\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.065938 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics-certs podName:faa92f29-2bae-4481-ab38-1a0b681d73d9 nodeName:}" failed. No retries permitted until 2025-12-06 09:18:25.565921235 +0000 UTC m=+723.310181522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics-certs") pod "frr-k8s-wjtmh" (UID: "faa92f29-2bae-4481-ab38-1a0b681d73d9") : secret "frr-k8s-certs-secret" not found Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.065952 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/47d9472b-be65-46ea-8eff-fa70e315ed49-metallb-excludel2\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.066298 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-conf\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.066403 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-startup\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.066558 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/faa92f29-2bae-4481-ab38-1a0b681d73d9-frr-sockets\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.083748 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a795467-0c6f-4dae-bd0e-0595c9eb88b4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mqk7f\" (UID: \"2a795467-0c6f-4dae-bd0e-0595c9eb88b4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.086115 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4z9\" (UniqueName: \"kubernetes.io/projected/faa92f29-2bae-4481-ab38-1a0b681d73d9-kube-api-access-bp4z9\") 
pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.087698 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnjtz\" (UniqueName: \"kubernetes.io/projected/2a795467-0c6f-4dae-bd0e-0595c9eb88b4-kube-api-access-gnjtz\") pod \"frr-k8s-webhook-server-7fcb986d4-mqk7f\" (UID: \"2a795467-0c6f-4dae-bd0e-0595c9eb88b4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.149735 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167399 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5gmn\" (UniqueName: \"kubernetes.io/projected/47d9472b-be65-46ea-8eff-fa70e315ed49-kube-api-access-h5gmn\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167442 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-metrics-certs\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167461 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-cert\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167499 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22hv5\" (UniqueName: \"kubernetes.io/projected/35023ac9-ea1e-4576-b700-4afe57f59230-kube-api-access-22hv5\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167535 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167615 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/47d9472b-be65-46ea-8eff-fa70e315ed49-metallb-excludel2\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.167643 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-metrics-certs\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.167770 4672 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 06 09:18:25 crc 
kubenswrapper[4672]: E1206 09:18:25.167826 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-metrics-certs podName:35023ac9-ea1e-4576-b700-4afe57f59230 nodeName:}" failed. No retries permitted until 2025-12-06 09:18:25.667804314 +0000 UTC m=+723.412064601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-metrics-certs") pod "controller-f8648f98b-ljcvb" (UID: "35023ac9-ea1e-4576-b700-4afe57f59230") : secret "controller-certs-secret" not found Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.167847 4672 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.167895 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist podName:47d9472b-be65-46ea-8eff-fa70e315ed49 nodeName:}" failed. No retries permitted until 2025-12-06 09:18:25.667881526 +0000 UTC m=+723.412141813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist") pod "speaker-skjzl" (UID: "47d9472b-be65-46ea-8eff-fa70e315ed49") : secret "metallb-memberlist" not found Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.168533 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/47d9472b-be65-46ea-8eff-fa70e315ed49-metallb-excludel2\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.170487 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-metrics-certs\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.176779 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.181732 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-cert\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.198188 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5gmn\" (UniqueName: \"kubernetes.io/projected/47d9472b-be65-46ea-8eff-fa70e315ed49-kube-api-access-h5gmn\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.200233 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hv5\" (UniqueName: \"kubernetes.io/projected/35023ac9-ea1e-4576-b700-4afe57f59230-kube-api-access-22hv5\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.593329 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics-certs\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.599657 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/faa92f29-2bae-4481-ab38-1a0b681d73d9-metrics-certs\") pod \"frr-k8s-wjtmh\" (UID: \"faa92f29-2bae-4481-ab38-1a0b681d73d9\") " pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.654327 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f"] Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.695357 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.695459 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-metrics-certs\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.695548 4672 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 09:18:25 crc kubenswrapper[4672]: E1206 09:18:25.695653 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist podName:47d9472b-be65-46ea-8eff-fa70e315ed49 nodeName:}" failed. No retries permitted until 2025-12-06 09:18:26.695628703 +0000 UTC m=+724.439888990 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist") pod "speaker-skjzl" (UID: "47d9472b-be65-46ea-8eff-fa70e315ed49") : secret "metallb-memberlist" not found Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.698451 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35023ac9-ea1e-4576-b700-4afe57f59230-metrics-certs\") pod \"controller-f8648f98b-ljcvb\" (UID: \"35023ac9-ea1e-4576-b700-4afe57f59230\") " pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.743284 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:25 crc kubenswrapper[4672]: I1206 09:18:25.883015 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.082641 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ljcvb"] Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.618613 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"3f21fdd85bcd777dd663bbad9f5345e8eef99e88b90d6bf2e0cae4c63b4a752b"} Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.622773 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ljcvb" event={"ID":"35023ac9-ea1e-4576-b700-4afe57f59230","Type":"ContainerStarted","Data":"18ba43d7c1d237def82320bb5a83f2307071f47bb46e64a19730283d55fc6ed3"} Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.622817 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ljcvb" event={"ID":"35023ac9-ea1e-4576-b700-4afe57f59230","Type":"ContainerStarted","Data":"191274de8dbbfc71e60d24ac5ef235303782cd7d82f00f545ba2a3e903150fff"} Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.624200 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" event={"ID":"2a795467-0c6f-4dae-bd0e-0595c9eb88b4","Type":"ContainerStarted","Data":"5098a3f0c836ffe64639e6759056387ae26e0efff6b40c0d2fd4260245dd0117"} Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.712097 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.734731 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/47d9472b-be65-46ea-8eff-fa70e315ed49-memberlist\") pod \"speaker-skjzl\" (UID: \"47d9472b-be65-46ea-8eff-fa70e315ed49\") " pod="metallb-system/speaker-skjzl" Dec 06 09:18:26 crc kubenswrapper[4672]: I1206 09:18:26.761619 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-skjzl" Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.630838 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ljcvb" event={"ID":"35023ac9-ea1e-4576-b700-4afe57f59230","Type":"ContainerStarted","Data":"90a61a5bd1bbfa7bbd9436855b51735e9cb2f31c0586cc743f20c8662140fa69"} Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.631629 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.633658 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-skjzl" event={"ID":"47d9472b-be65-46ea-8eff-fa70e315ed49","Type":"ContainerStarted","Data":"a015171afb431ae0e96ac3095d338c5d335cb821478ff13d92c33ad655262ff6"} Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.633677 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-skjzl" event={"ID":"47d9472b-be65-46ea-8eff-fa70e315ed49","Type":"ContainerStarted","Data":"79c85d9cb6096277a5891b94df977f4029e44d1ecf6e19c2f5b23bba84f746ab"} Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.633686 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-skjzl" event={"ID":"47d9472b-be65-46ea-8eff-fa70e315ed49","Type":"ContainerStarted","Data":"e89c5d9396c0c2c9e61b1ed6573d7471fd2f563482f08adbcb85997b71ad786c"} Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.634017 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-skjzl" Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.647405 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-ljcvb" podStartSLOduration=3.647389251 podStartE2EDuration="3.647389251s" podCreationTimestamp="2025-12-06 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:18:27.647207386 +0000 UTC m=+725.391467673" watchObservedRunningTime="2025-12-06 09:18:27.647389251 +0000 UTC m=+725.391649538" Dec 06 09:18:27 crc kubenswrapper[4672]: I1206 09:18:27.686260 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-skjzl" podStartSLOduration=3.686246262 podStartE2EDuration="3.686246262s" podCreationTimestamp="2025-12-06 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:18:27.683517732 +0000 UTC m=+725.427778019" watchObservedRunningTime="2025-12-06 09:18:27.686246262 +0000 UTC m=+725.430506549" Dec 06 09:18:33 crc kubenswrapper[4672]: I1206 09:18:33.688903 4672 generic.go:334] "Generic (PLEG): container finished" podID="faa92f29-2bae-4481-ab38-1a0b681d73d9" containerID="0305efc48f6f492c2e9737188d3ce160b0d69e1fecc6be7c25745e9627125c2e" exitCode=0 Dec 06 09:18:33 crc kubenswrapper[4672]: I1206 09:18:33.688963 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerDied","Data":"0305efc48f6f492c2e9737188d3ce160b0d69e1fecc6be7c25745e9627125c2e"} Dec 06 09:18:33 crc kubenswrapper[4672]: I1206 09:18:33.691729 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" 
event={"ID":"2a795467-0c6f-4dae-bd0e-0595c9eb88b4","Type":"ContainerStarted","Data":"2dee187c06a26f55f8e51c19a1d5e3680eddba891258a931210a65db25bf25c7"} Dec 06 09:18:33 crc kubenswrapper[4672]: I1206 09:18:33.692167 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:34 crc kubenswrapper[4672]: I1206 09:18:34.700972 4672 generic.go:334] "Generic (PLEG): container finished" podID="faa92f29-2bae-4481-ab38-1a0b681d73d9" containerID="031eb96493bbf50730de32842ce7a4d2c27e5ad1072a33910965bdc8ab350c16" exitCode=0 Dec 06 09:18:34 crc kubenswrapper[4672]: I1206 09:18:34.701039 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerDied","Data":"031eb96493bbf50730de32842ce7a4d2c27e5ad1072a33910965bdc8ab350c16"} Dec 06 09:18:34 crc kubenswrapper[4672]: I1206 09:18:34.732254 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" podStartSLOduration=2.926431232 podStartE2EDuration="10.732236026s" podCreationTimestamp="2025-12-06 09:18:24 +0000 UTC" firstStartedPulling="2025-12-06 09:18:25.659149923 +0000 UTC m=+723.403410210" lastFinishedPulling="2025-12-06 09:18:33.464954717 +0000 UTC m=+731.209215004" observedRunningTime="2025-12-06 09:18:33.736529266 +0000 UTC m=+731.480789553" watchObservedRunningTime="2025-12-06 09:18:34.732236026 +0000 UTC m=+732.476496313" Dec 06 09:18:35 crc kubenswrapper[4672]: I1206 09:18:35.710087 4672 generic.go:334] "Generic (PLEG): container finished" podID="faa92f29-2bae-4481-ab38-1a0b681d73d9" containerID="5ebb71b8a975228b5fb8b0713e6894550eba4ef81a04c490ba6c2daffdd685a9" exitCode=0 Dec 06 09:18:35 crc kubenswrapper[4672]: I1206 09:18:35.710191 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerDied","Data":"5ebb71b8a975228b5fb8b0713e6894550eba4ef81a04c490ba6c2daffdd685a9"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736424 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"f11737d4094387e260a136167ecb804e83befd8a3662987f6cc87958d14f6fe5"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736499 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"e3a8bb17b60360712ccd1d84e17782fd955515ba1cf42f408f8943c2e6789716"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"4181767edbb4f4df6add32e1ec9e225dcf2e6e47d85c61ab5e470a2bea282a28"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736524 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"72ed740c234c9557424dd1326764e889e8a210ccaff1b3423a9284c89402c525"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736551 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" 
event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"5aa5dff6b1d02230891c58892f426b541023b394633e5d6f35d0450ffa7945ce"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736578 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wjtmh" event={"ID":"faa92f29-2bae-4481-ab38-1a0b681d73d9","Type":"ContainerStarted","Data":"ac1b472825cd5079831c03da97a0c0c8fdd6040a9d261f0e4847454e655c3e43"} Dec 06 09:18:36 crc kubenswrapper[4672]: I1206 09:18:36.736955 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:40 crc kubenswrapper[4672]: I1206 09:18:40.743970 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:40 crc kubenswrapper[4672]: I1206 09:18:40.829076 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:40 crc kubenswrapper[4672]: I1206 09:18:40.866274 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wjtmh" podStartSLOduration=9.920386521 podStartE2EDuration="16.866257189s" podCreationTimestamp="2025-12-06 09:18:24 +0000 UTC" firstStartedPulling="2025-12-06 09:18:26.524131487 +0000 UTC m=+724.268391774" lastFinishedPulling="2025-12-06 09:18:33.470002145 +0000 UTC m=+731.214262442" observedRunningTime="2025-12-06 09:18:36.777165948 +0000 UTC m=+734.521426235" watchObservedRunningTime="2025-12-06 09:18:40.866257189 +0000 UTC m=+738.610517476" Dec 06 09:18:42 crc kubenswrapper[4672]: I1206 09:18:42.319282 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:18:42 crc kubenswrapper[4672]: I1206 09:18:42.319362 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:18:45 crc kubenswrapper[4672]: I1206 09:18:45.156479 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mqk7f" Dec 06 09:18:45 crc kubenswrapper[4672]: I1206 09:18:45.749939 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wjtmh" Dec 06 09:18:45 crc kubenswrapper[4672]: I1206 09:18:45.886635 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-ljcvb" Dec 06 09:18:46 crc kubenswrapper[4672]: I1206 09:18:46.765833 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-skjzl" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.606740 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-89vhg"] Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.608272 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.614233 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.614381 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.615141 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xp877" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.636226 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-89vhg"] Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.682646 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6x2h\" (UniqueName: \"kubernetes.io/projected/46000da3-def7-4b11-84a8-9e596ddae8ea-kube-api-access-d6x2h\") pod \"openstack-operator-index-89vhg\" (UID: \"46000da3-def7-4b11-84a8-9e596ddae8ea\") " pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.783661 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6x2h\" (UniqueName: \"kubernetes.io/projected/46000da3-def7-4b11-84a8-9e596ddae8ea-kube-api-access-d6x2h\") pod \"openstack-operator-index-89vhg\" (UID: \"46000da3-def7-4b11-84a8-9e596ddae8ea\") " pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.807212 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6x2h\" (UniqueName: \"kubernetes.io/projected/46000da3-def7-4b11-84a8-9e596ddae8ea-kube-api-access-d6x2h\") pod \"openstack-operator-index-89vhg\" (UID: \"46000da3-def7-4b11-84a8-9e596ddae8ea\") " pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:49 crc kubenswrapper[4672]: I1206 09:18:49.930234 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:50 crc kubenswrapper[4672]: I1206 09:18:50.340018 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-89vhg"] Dec 06 09:18:50 crc kubenswrapper[4672]: W1206 09:18:50.350501 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46000da3_def7_4b11_84a8_9e596ddae8ea.slice/crio-f76fdde1807179323278124cfb8ee02cc786263d27727cb3a8da9663249b75bc WatchSource:0}: Error finding container f76fdde1807179323278124cfb8ee02cc786263d27727cb3a8da9663249b75bc: Status 404 returned error can't find the container with id f76fdde1807179323278124cfb8ee02cc786263d27727cb3a8da9663249b75bc Dec 06 09:18:50 crc kubenswrapper[4672]: I1206 09:18:50.849474 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-89vhg" event={"ID":"46000da3-def7-4b11-84a8-9e596ddae8ea","Type":"ContainerStarted","Data":"f76fdde1807179323278124cfb8ee02cc786263d27727cb3a8da9663249b75bc"} Dec 06 09:18:52 crc kubenswrapper[4672]: I1206 09:18:52.379566 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-89vhg"] Dec 06 09:18:52 crc kubenswrapper[4672]: I1206 09:18:52.989806 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rx48l"] Dec 06 09:18:52 crc kubenswrapper[4672]: I1206 09:18:52.991061 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.000787 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rx48l"] Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.025344 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxhc\" (UniqueName: \"kubernetes.io/projected/250af723-f950-4125-8748-d7eac336f4c1-kube-api-access-psxhc\") pod \"openstack-operator-index-rx48l\" (UID: \"250af723-f950-4125-8748-d7eac336f4c1\") " pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.126193 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxhc\" (UniqueName: \"kubernetes.io/projected/250af723-f950-4125-8748-d7eac336f4c1-kube-api-access-psxhc\") pod \"openstack-operator-index-rx48l\" (UID: \"250af723-f950-4125-8748-d7eac336f4c1\") " pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.145524 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxhc\" (UniqueName: \"kubernetes.io/projected/250af723-f950-4125-8748-d7eac336f4c1-kube-api-access-psxhc\") pod \"openstack-operator-index-rx48l\" (UID: \"250af723-f950-4125-8748-d7eac336f4c1\") " pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.348184 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.752404 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rx48l"] Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.869738 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-89vhg" event={"ID":"46000da3-def7-4b11-84a8-9e596ddae8ea","Type":"ContainerStarted","Data":"c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67"} Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.869869 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-89vhg" podUID="46000da3-def7-4b11-84a8-9e596ddae8ea" containerName="registry-server" containerID="cri-o://c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67" gracePeriod=2 Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.871292 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rx48l" event={"ID":"250af723-f950-4125-8748-d7eac336f4c1","Type":"ContainerStarted","Data":"3dae4754c03cc9945036bbb425754acb8efb86e7a4c5e803b787e83dd355bbe8"} Dec 06 09:18:53 crc kubenswrapper[4672]: I1206 09:18:53.889562 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-89vhg" podStartSLOduration=2.07090218 podStartE2EDuration="4.889546627s" podCreationTimestamp="2025-12-06 09:18:49 +0000 UTC" firstStartedPulling="2025-12-06 09:18:50.35227712 +0000 UTC m=+748.096537417" lastFinishedPulling="2025-12-06 09:18:53.170921577 +0000 UTC m=+750.915181864" observedRunningTime="2025-12-06 09:18:53.886166098 +0000 UTC m=+751.630426425" watchObservedRunningTime="2025-12-06 09:18:53.889546627 +0000 UTC m=+751.633806914" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.397571 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.456772 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6x2h\" (UniqueName: \"kubernetes.io/projected/46000da3-def7-4b11-84a8-9e596ddae8ea-kube-api-access-d6x2h\") pod \"46000da3-def7-4b11-84a8-9e596ddae8ea\" (UID: \"46000da3-def7-4b11-84a8-9e596ddae8ea\") " Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.462756 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46000da3-def7-4b11-84a8-9e596ddae8ea-kube-api-access-d6x2h" (OuterVolumeSpecName: "kube-api-access-d6x2h") pod "46000da3-def7-4b11-84a8-9e596ddae8ea" (UID: "46000da3-def7-4b11-84a8-9e596ddae8ea"). InnerVolumeSpecName "kube-api-access-d6x2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.558045 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6x2h\" (UniqueName: \"kubernetes.io/projected/46000da3-def7-4b11-84a8-9e596ddae8ea-kube-api-access-d6x2h\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.879532 4672 generic.go:334] "Generic (PLEG): container finished" podID="46000da3-def7-4b11-84a8-9e596ddae8ea" containerID="c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67" exitCode=0 Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.879716 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-89vhg" event={"ID":"46000da3-def7-4b11-84a8-9e596ddae8ea","Type":"ContainerDied","Data":"c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67"} Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.879750 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-89vhg" event={"ID":"46000da3-def7-4b11-84a8-9e596ddae8ea","Type":"ContainerDied","Data":"f76fdde1807179323278124cfb8ee02cc786263d27727cb3a8da9663249b75bc"} Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.879759 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-89vhg" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.879768 4672 scope.go:117] "RemoveContainer" containerID="c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.882358 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rx48l" event={"ID":"250af723-f950-4125-8748-d7eac336f4c1","Type":"ContainerStarted","Data":"b0166d191b1857e4f2e360658f43de1cfd8c3fff9043df28da3b217331849a23"} Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.900486 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-89vhg"] Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.902014 4672 scope.go:117] "RemoveContainer" containerID="c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67" Dec 06 09:18:54 crc kubenswrapper[4672]: E1206 09:18:54.902519 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67\": container with ID starting with c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67 not found: ID does not exist" containerID="c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.902557 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67"} err="failed to get container status \"c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67\": rpc error: code = NotFound desc = could not find container \"c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67\": container with ID starting with c3de0a81a7c58cc6b0365431d615778668d2a1bc4812c2729ed9cbd4a23bbf67 not found: ID does not exist" Dec 06 09:18:54 crc kubenswrapper[4672]: I1206 09:18:54.906362 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-89vhg"] Dec 06 09:18:56 
crc kubenswrapper[4672]: I1206 09:18:56.568195 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46000da3-def7-4b11-84a8-9e596ddae8ea" path="/var/lib/kubelet/pods/46000da3-def7-4b11-84a8-9e596ddae8ea/volumes" Dec 06 09:19:03 crc kubenswrapper[4672]: I1206 09:19:03.349033 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:19:03 crc kubenswrapper[4672]: I1206 09:19:03.351296 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:19:03 crc kubenswrapper[4672]: I1206 09:19:03.380249 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:19:03 crc kubenswrapper[4672]: I1206 09:19:03.395117 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rx48l" podStartSLOduration=10.970223852 podStartE2EDuration="11.395101911s" podCreationTimestamp="2025-12-06 09:18:52 +0000 UTC" firstStartedPulling="2025-12-06 09:18:53.761444227 +0000 UTC m=+751.505704514" lastFinishedPulling="2025-12-06 09:18:54.186322266 +0000 UTC m=+751.930582573" observedRunningTime="2025-12-06 09:18:54.914824335 +0000 UTC m=+752.659084622" watchObservedRunningTime="2025-12-06 09:19:03.395101911 +0000 UTC m=+761.139362198" Dec 06 09:19:03 crc kubenswrapper[4672]: I1206 09:19:03.980941 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rx48l" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.028380 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm"] Dec 06 09:19:06 crc kubenswrapper[4672]: E1206 09:19:06.029117 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46000da3-def7-4b11-84a8-9e596ddae8ea" containerName="registry-server" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.029140 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="46000da3-def7-4b11-84a8-9e596ddae8ea" containerName="registry-server" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.029331 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="46000da3-def7-4b11-84a8-9e596ddae8ea" containerName="registry-server" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.030626 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.034951 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm"] Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.036405 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hp54m" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.223842 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.223957 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nvj6\" (UniqueName: \"kubernetes.io/projected/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-kube-api-access-2nvj6\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.224000 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.324833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.324905 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nvj6\" (UniqueName: \"kubernetes.io/projected/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-kube-api-access-2nvj6\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.324934 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.325319 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.325532 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.351173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nvj6\" (UniqueName: \"kubernetes.io/projected/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-kube-api-access-2nvj6\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.355271 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.831912 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm"] Dec 06 09:19:06 crc kubenswrapper[4672]: W1206 09:19:06.844314 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2459c7d_a6d6_48c8_9a18_48d05c0129a9.slice/crio-98c94d5f1368eadcd8cecc15566bdf1482a227c7a5e5e433435aa2cc28c39d52 WatchSource:0}: Error finding container 98c94d5f1368eadcd8cecc15566bdf1482a227c7a5e5e433435aa2cc28c39d52: Status 404 returned error can't find the container with id 98c94d5f1368eadcd8cecc15566bdf1482a227c7a5e5e433435aa2cc28c39d52 Dec 06 09:19:06 crc kubenswrapper[4672]: I1206 09:19:06.980955 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" event={"ID":"a2459c7d-a6d6-48c8-9a18-48d05c0129a9","Type":"ContainerStarted","Data":"98c94d5f1368eadcd8cecc15566bdf1482a227c7a5e5e433435aa2cc28c39d52"} Dec 06 09:19:07 crc kubenswrapper[4672]: I1206 09:19:07.990404 4672 generic.go:334] "Generic (PLEG): container finished" podID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerID="72abe00dda7731520722cc49667dc440b458c72521b0f00037c3a6e9967b2ded" exitCode=0 Dec 06 09:19:07 crc kubenswrapper[4672]: I1206 09:19:07.990453 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" event={"ID":"a2459c7d-a6d6-48c8-9a18-48d05c0129a9","Type":"ContainerDied","Data":"72abe00dda7731520722cc49667dc440b458c72521b0f00037c3a6e9967b2ded"} Dec 06 09:19:08 crc kubenswrapper[4672]: I1206 09:19:08.671357 4672 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 09:19:08 crc kubenswrapper[4672]: I1206 09:19:08.996103 4672 generic.go:334] "Generic (PLEG): container finished" podID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" 
containerID="94fb476cbcaa9b0610816659d7b679025ece0a981ea213d000cd824812c1acc6" exitCode=0 Dec 06 09:19:08 crc kubenswrapper[4672]: I1206 09:19:08.996153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" event={"ID":"a2459c7d-a6d6-48c8-9a18-48d05c0129a9","Type":"ContainerDied","Data":"94fb476cbcaa9b0610816659d7b679025ece0a981ea213d000cd824812c1acc6"} Dec 06 09:19:10 crc kubenswrapper[4672]: I1206 09:19:10.005586 4672 generic.go:334] "Generic (PLEG): container finished" podID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerID="f36f2887c392eb8ee4af99619433a6ebc81724fdc3bcaab8c922282c457ca155" exitCode=0 Dec 06 09:19:10 crc kubenswrapper[4672]: I1206 09:19:10.005885 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" event={"ID":"a2459c7d-a6d6-48c8-9a18-48d05c0129a9","Type":"ContainerDied","Data":"f36f2887c392eb8ee4af99619433a6ebc81724fdc3bcaab8c922282c457ca155"} Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.363024 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.498520 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-bundle\") pod \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.498919 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nvj6\" (UniqueName: \"kubernetes.io/projected/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-kube-api-access-2nvj6\") pod \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.499089 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-util\") pod \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\" (UID: \"a2459c7d-a6d6-48c8-9a18-48d05c0129a9\") " Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.499502 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-bundle" (OuterVolumeSpecName: "bundle") pod "a2459c7d-a6d6-48c8-9a18-48d05c0129a9" (UID: "a2459c7d-a6d6-48c8-9a18-48d05c0129a9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.499800 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.504854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-kube-api-access-2nvj6" (OuterVolumeSpecName: "kube-api-access-2nvj6") pod "a2459c7d-a6d6-48c8-9a18-48d05c0129a9" (UID: "a2459c7d-a6d6-48c8-9a18-48d05c0129a9"). InnerVolumeSpecName "kube-api-access-2nvj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.516566 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-util" (OuterVolumeSpecName: "util") pod "a2459c7d-a6d6-48c8-9a18-48d05c0129a9" (UID: "a2459c7d-a6d6-48c8-9a18-48d05c0129a9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.601475 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nvj6\" (UniqueName: \"kubernetes.io/projected/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-kube-api-access-2nvj6\") on node \"crc\" DevicePath \"\"" Dec 06 09:19:11 crc kubenswrapper[4672]: I1206 09:19:11.601535 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2459c7d-a6d6-48c8-9a18-48d05c0129a9-util\") on node \"crc\" DevicePath \"\"" Dec 06 09:19:12 crc kubenswrapper[4672]: I1206 09:19:12.026170 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" event={"ID":"a2459c7d-a6d6-48c8-9a18-48d05c0129a9","Type":"ContainerDied","Data":"98c94d5f1368eadcd8cecc15566bdf1482a227c7a5e5e433435aa2cc28c39d52"} Dec 06 09:19:12 crc kubenswrapper[4672]: I1206 09:19:12.026253 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm" Dec 06 09:19:12 crc kubenswrapper[4672]: I1206 09:19:12.026266 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98c94d5f1368eadcd8cecc15566bdf1482a227c7a5e5e433435aa2cc28c39d52" Dec 06 09:19:12 crc kubenswrapper[4672]: I1206 09:19:12.320386 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:19:12 crc kubenswrapper[4672]: I1206 09:19:12.320475 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.258088 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx"] Dec 06 09:19:18 crc kubenswrapper[4672]: E1206 09:19:18.258743 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="util" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.258754 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="util" Dec 06 09:19:18 crc kubenswrapper[4672]: E1206 09:19:18.258770 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="extract" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.258776 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="extract" Dec 06 09:19:18 crc kubenswrapper[4672]: 
E1206 09:19:18.258789 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="pull" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.258795 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="pull" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.258908 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2459c7d-a6d6-48c8-9a18-48d05c0129a9" containerName="extract" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.259283 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.262340 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2nkmw" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.347374 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx"] Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.393689 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txwtm\" (UniqueName: \"kubernetes.io/projected/63582a9a-093b-44e1-8932-4b910f301e52-kube-api-access-txwtm\") pod \"openstack-operator-controller-operator-55b6fb9447-vc2xx\" (UID: \"63582a9a-093b-44e1-8932-4b910f301e52\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.495311 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txwtm\" (UniqueName: \"kubernetes.io/projected/63582a9a-093b-44e1-8932-4b910f301e52-kube-api-access-txwtm\") pod \"openstack-operator-controller-operator-55b6fb9447-vc2xx\" (UID: \"63582a9a-093b-44e1-8932-4b910f301e52\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.514807 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txwtm\" (UniqueName: \"kubernetes.io/projected/63582a9a-093b-44e1-8932-4b910f301e52-kube-api-access-txwtm\") pod \"openstack-operator-controller-operator-55b6fb9447-vc2xx\" (UID: \"63582a9a-093b-44e1-8932-4b910f301e52\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:18 crc kubenswrapper[4672]: I1206 09:19:18.574742 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:19 crc kubenswrapper[4672]: I1206 09:19:19.047934 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx"] Dec 06 09:19:19 crc kubenswrapper[4672]: I1206 09:19:19.072400 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" event={"ID":"63582a9a-093b-44e1-8932-4b910f301e52","Type":"ContainerStarted","Data":"a2632d2a5a7003639136758cb9df9322a39d0c741fe59b1c770c869e3231d517"} Dec 06 09:19:25 crc kubenswrapper[4672]: I1206 09:19:25.108110 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" event={"ID":"63582a9a-093b-44e1-8932-4b910f301e52","Type":"ContainerStarted","Data":"ffbd398e3970ef5999a1938e3f75501fecdbd208e2609553780a8f303de42d27"} Dec 06 09:19:25 crc kubenswrapper[4672]: I1206 09:19:25.108674 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:25 crc kubenswrapper[4672]: I1206 09:19:25.150913 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" podStartSLOduration=2.226558976 podStartE2EDuration="7.15088655s" podCreationTimestamp="2025-12-06 09:19:18 +0000 UTC" firstStartedPulling="2025-12-06 09:19:19.0514102 +0000 UTC m=+776.795670497" lastFinishedPulling="2025-12-06 09:19:23.975737784 +0000 UTC m=+781.719998071" observedRunningTime="2025-12-06 09:19:25.144016989 +0000 UTC m=+782.888277296" watchObservedRunningTime="2025-12-06 09:19:25.15088655 +0000 UTC m=+782.895146847" Dec 06 09:19:38 crc kubenswrapper[4672]: I1206 09:19:38.577830 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-vc2xx" Dec 06 09:19:42 crc kubenswrapper[4672]: I1206 09:19:42.319795 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:19:42 crc kubenswrapper[4672]: I1206 09:19:42.319884 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:19:42 crc kubenswrapper[4672]: I1206 09:19:42.319948 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:19:42 crc kubenswrapper[4672]: I1206 09:19:42.320815 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"157a1103c9931308d56d2a9afffb01b9138166ad6f81a369e330a682cba427f9"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:19:42 crc kubenswrapper[4672]: I1206 09:19:42.320918 
4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://157a1103c9931308d56d2a9afffb01b9138166ad6f81a369e330a682cba427f9" gracePeriod=600 Dec 06 09:19:44 crc kubenswrapper[4672]: I1206 09:19:44.224189 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="157a1103c9931308d56d2a9afffb01b9138166ad6f81a369e330a682cba427f9" exitCode=0 Dec 06 09:19:44 crc kubenswrapper[4672]: I1206 09:19:44.224329 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"157a1103c9931308d56d2a9afffb01b9138166ad6f81a369e330a682cba427f9"} Dec 06 09:19:44 crc kubenswrapper[4672]: I1206 09:19:44.224522 4672 scope.go:117] "RemoveContainer" containerID="2b410864b2f905e632c9f0faa7e115cee3e4f8d1dd843cd26f566a60bf5790f9" Dec 06 09:19:45 crc kubenswrapper[4672]: I1206 09:19:45.234244 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"6dc0e941a4dd3e79f056ce0d1f08eb3aa888fb31efcafdbd3ecc3f28c01b9f06"} Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.923318 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2"] Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.924536 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.931872 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-85ljj" Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.951490 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2"] Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.959490 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n"] Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.960351 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.962110 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-k4fhv" Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.983538 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj"] Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.984392 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.988234 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-kkg6q" Dec 06 09:19:58 crc kubenswrapper[4672]: I1206 09:19:58.990138 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.005169 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.048422 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.049447 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.052464 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-96q6c" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.061660 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.062737 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.065279 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.082264 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8blqj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.086672 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2tks\" (UniqueName: \"kubernetes.io/projected/7e99a7a0-5a1d-4143-a8b7-9fb170d119a2-kube-api-access-t2tks\") pod \"designate-operator-controller-manager-78b4bc895b-6jcpj\" (UID: \"7e99a7a0-5a1d-4143-a8b7-9fb170d119a2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.096926 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twm7\" (UniqueName: \"kubernetes.io/projected/ce4e8b8a-4f3a-4303-9455-8eb984c06f57-kube-api-access-7twm7\") pod \"barbican-operator-controller-manager-7d9dfd778-lh7x2\" (UID: \"ce4e8b8a-4f3a-4303-9455-8eb984c06f57\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.097166 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drl48\" (UniqueName: \"kubernetes.io/projected/7dc29189-4c37-4886-af89-7c6cb57f237e-kube-api-access-drl48\") pod \"cinder-operator-controller-manager-859b6ccc6-cpc5n\" (UID: \"7dc29189-4c37-4886-af89-7c6cb57f237e\") " 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.140421 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.156233 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.158751 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.170070 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7xw8q" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.202303 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nl2x\" (UniqueName: \"kubernetes.io/projected/96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43-kube-api-access-2nl2x\") pod \"heat-operator-controller-manager-5f64f6f8bb-2zwxr\" (UID: \"96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.202382 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2tks\" (UniqueName: \"kubernetes.io/projected/7e99a7a0-5a1d-4143-a8b7-9fb170d119a2-kube-api-access-t2tks\") pod \"designate-operator-controller-manager-78b4bc895b-6jcpj\" (UID: \"7e99a7a0-5a1d-4143-a8b7-9fb170d119a2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.202430 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjbtr\" (UniqueName: \"kubernetes.io/projected/018edeb2-cc58-49fe-a7ea-15a8b6646ddd-kube-api-access-gjbtr\") pod \"glance-operator-controller-manager-77987cd8cd-p7c94\" (UID: \"018edeb2-cc58-49fe-a7ea-15a8b6646ddd\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.202447 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twm7\" (UniqueName: \"kubernetes.io/projected/ce4e8b8a-4f3a-4303-9455-8eb984c06f57-kube-api-access-7twm7\") pod \"barbican-operator-controller-manager-7d9dfd778-lh7x2\" (UID: \"ce4e8b8a-4f3a-4303-9455-8eb984c06f57\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.202469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drl48\" (UniqueName: \"kubernetes.io/projected/7dc29189-4c37-4886-af89-7c6cb57f237e-kube-api-access-drl48\") pod \"cinder-operator-controller-manager-859b6ccc6-cpc5n\" (UID: \"7dc29189-4c37-4886-af89-7c6cb57f237e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.241319 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.241405 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.242773 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.248140 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twm7\" (UniqueName: \"kubernetes.io/projected/ce4e8b8a-4f3a-4303-9455-8eb984c06f57-kube-api-access-7twm7\") pod \"barbican-operator-controller-manager-7d9dfd778-lh7x2\" (UID: \"ce4e8b8a-4f3a-4303-9455-8eb984c06f57\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.255198 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-79tgh" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.255440 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.256727 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.257119 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.261498 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2tks\" (UniqueName: \"kubernetes.io/projected/7e99a7a0-5a1d-4143-a8b7-9fb170d119a2-kube-api-access-t2tks\") pod \"designate-operator-controller-manager-78b4bc895b-6jcpj\" (UID: \"7e99a7a0-5a1d-4143-a8b7-9fb170d119a2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.261966 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lwzlq" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.269670 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.270570 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drl48\" (UniqueName: \"kubernetes.io/projected/7dc29189-4c37-4886-af89-7c6cb57f237e-kube-api-access-drl48\") pod \"cinder-operator-controller-manager-859b6ccc6-cpc5n\" (UID: \"7dc29189-4c37-4886-af89-7c6cb57f237e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.270590 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.274238 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9twms" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.274384 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.287270 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.300334 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.304310 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nl2x\" (UniqueName: \"kubernetes.io/projected/96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43-kube-api-access-2nl2x\") pod \"heat-operator-controller-manager-5f64f6f8bb-2zwxr\" (UID: \"96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.304391 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4gp\" (UniqueName: \"kubernetes.io/projected/7753548d-df52-4a65-b447-d20dcd379cde-kube-api-access-xl4gp\") pod \"horizon-operator-controller-manager-68c6d99b8f-dvzm4\" (UID: \"7753548d-df52-4a65-b447-d20dcd379cde\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.304481 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjbtr\" (UniqueName: \"kubernetes.io/projected/018edeb2-cc58-49fe-a7ea-15a8b6646ddd-kube-api-access-gjbtr\") pod \"glance-operator-controller-manager-77987cd8cd-p7c94\" (UID: \"018edeb2-cc58-49fe-a7ea-15a8b6646ddd\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.316301 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.338185 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjbtr\" (UniqueName: \"kubernetes.io/projected/018edeb2-cc58-49fe-a7ea-15a8b6646ddd-kube-api-access-gjbtr\") pod \"glance-operator-controller-manager-77987cd8cd-p7c94\" (UID: \"018edeb2-cc58-49fe-a7ea-15a8b6646ddd\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.369688 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.370374 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.371549 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.371641 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.381265 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nl2x\" (UniqueName: \"kubernetes.io/projected/96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43-kube-api-access-2nl2x\") pod \"heat-operator-controller-manager-5f64f6f8bb-2zwxr\" (UID: \"96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.383342 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-crvwd" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.383491 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.384440 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.390840 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-k26d8" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.398755 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.405500 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.406260 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plzz\" (UniqueName: \"kubernetes.io/projected/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-kube-api-access-8plzz\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.406311 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm96\" (UniqueName: \"kubernetes.io/projected/308c58b1-3c6a-4c79-88fc-b4d515efd96d-kube-api-access-fmm96\") pod \"keystone-operator-controller-manager-7765d96ddf-j7cvj\" (UID: \"308c58b1-3c6a-4c79-88fc-b4d515efd96d\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.406345 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.406365 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr7cp\" (UniqueName: \"kubernetes.io/projected/9977f421-c235-40ef-8d9f-2e0125bf3593-kube-api-access-zr7cp\") pod \"ironic-operator-controller-manager-6c548fd776-8ql2p\" (UID: \"9977f421-c235-40ef-8d9f-2e0125bf3593\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.406392 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4gp\" (UniqueName: \"kubernetes.io/projected/7753548d-df52-4a65-b447-d20dcd379cde-kube-api-access-xl4gp\") pod \"horizon-operator-controller-manager-68c6d99b8f-dvzm4\" (UID: \"7753548d-df52-4a65-b447-d20dcd379cde\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.450868 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl4gp\" (UniqueName: \"kubernetes.io/projected/7753548d-df52-4a65-b447-d20dcd379cde-kube-api-access-xl4gp\") pod \"horizon-operator-controller-manager-68c6d99b8f-dvzm4\" (UID: \"7753548d-df52-4a65-b447-d20dcd379cde\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.451147 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.462248 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.473161 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.479614 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.484671 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.489706 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wwlw7" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.490528 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.491870 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.506735 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-h82hc" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.514807 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plzz\" (UniqueName: \"kubernetes.io/projected/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-kube-api-access-8plzz\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.514918 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz867\" (UniqueName: \"kubernetes.io/projected/27d7a7f5-ab93-40b6-8718-0a8b930d2c0f-kube-api-access-tz867\") pod \"mariadb-operator-controller-manager-56bbcc9d85-crbgz\" (UID: \"27d7a7f5-ab93-40b6-8718-0a8b930d2c0f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.514965 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm96\" (UniqueName: \"kubernetes.io/projected/308c58b1-3c6a-4c79-88fc-b4d515efd96d-kube-api-access-fmm96\") pod \"keystone-operator-controller-manager-7765d96ddf-j7cvj\" (UID: \"308c58b1-3c6a-4c79-88fc-b4d515efd96d\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.515011 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57k4f\" (UniqueName: \"kubernetes.io/projected/3fda2255-f593-42c6-b17e-2996a6ce7c5e-kube-api-access-57k4f\") pod \"manila-operator-controller-manager-7c79b5df47-zxcvx\" (UID: \"3fda2255-f593-42c6-b17e-2996a6ce7c5e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.515045 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zr7cp\" (UniqueName: \"kubernetes.io/projected/9977f421-c235-40ef-8d9f-2e0125bf3593-kube-api-access-zr7cp\") pod \"ironic-operator-controller-manager-6c548fd776-8ql2p\" (UID: \"9977f421-c235-40ef-8d9f-2e0125bf3593\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.515062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:19:59 crc kubenswrapper[4672]: E1206 09:19:59.515204 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 09:19:59 crc kubenswrapper[4672]: E1206 09:19:59.515256 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert podName:6bbb7d8a-ba3a-476a-b09d-0fd084fc325e nodeName:}" failed. No retries permitted until 2025-12-06 09:20:00.015238011 +0000 UTC m=+817.759498298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert") pod "infra-operator-controller-manager-57548d458d-rwjvr" (UID: "6bbb7d8a-ba3a-476a-b09d-0fd084fc325e") : secret "infra-operator-webhook-server-cert" not found Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.551773 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.553564 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.558770 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.561568 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.567757 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c6jff" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.579435 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plzz\" (UniqueName: \"kubernetes.io/projected/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-kube-api-access-8plzz\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.581773 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.588739 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm96\" (UniqueName: \"kubernetes.io/projected/308c58b1-3c6a-4c79-88fc-b4d515efd96d-kube-api-access-fmm96\") pod \"keystone-operator-controller-manager-7765d96ddf-j7cvj\" (UID: \"308c58b1-3c6a-4c79-88fc-b4d515efd96d\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.605736 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr7cp\" (UniqueName: \"kubernetes.io/projected/9977f421-c235-40ef-8d9f-2e0125bf3593-kube-api-access-zr7cp\") pod \"ironic-operator-controller-manager-6c548fd776-8ql2p\" (UID: \"9977f421-c235-40ef-8d9f-2e0125bf3593\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.615757 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rf68\" (UniqueName: \"kubernetes.io/projected/8244458a-10b4-4c4f-8f9e-dc93e90329af-kube-api-access-2rf68\") pod \"nova-operator-controller-manager-697bc559fc-kpmch\" (UID: \"8244458a-10b4-4c4f-8f9e-dc93e90329af\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.615893 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57k4f\" (UniqueName: \"kubernetes.io/projected/3fda2255-f593-42c6-b17e-2996a6ce7c5e-kube-api-access-57k4f\") pod \"manila-operator-controller-manager-7c79b5df47-zxcvx\" (UID: \"3fda2255-f593-42c6-b17e-2996a6ce7c5e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.616023 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks2zq\" (UniqueName: \"kubernetes.io/projected/a59bea52-a8d1-4ac9-8ce0-0a623efcb009-kube-api-access-ks2zq\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pqnb9\" (UID: \"a59bea52-a8d1-4ac9-8ce0-0a623efcb009\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.616130 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz867\" (UniqueName: \"kubernetes.io/projected/27d7a7f5-ab93-40b6-8718-0a8b930d2c0f-kube-api-access-tz867\") 
pod \"mariadb-operator-controller-manager-56bbcc9d85-crbgz\" (UID: \"27d7a7f5-ab93-40b6-8718-0a8b930d2c0f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.633244 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.633336 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.634566 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.638100 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.645153 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ts242" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.655038 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.658376 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-97gjs" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.660446 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57k4f\" (UniqueName: \"kubernetes.io/projected/3fda2255-f593-42c6-b17e-2996a6ce7c5e-kube-api-access-57k4f\") pod \"manila-operator-controller-manager-7c79b5df47-zxcvx\" (UID: \"3fda2255-f593-42c6-b17e-2996a6ce7c5e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.664678 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.668093 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.673642 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz867\" (UniqueName: \"kubernetes.io/projected/27d7a7f5-ab93-40b6-8718-0a8b930d2c0f-kube-api-access-tz867\") pod \"mariadb-operator-controller-manager-56bbcc9d85-crbgz\" (UID: \"27d7a7f5-ab93-40b6-8718-0a8b930d2c0f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.684563 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.695047 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.751102 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.752847 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.759119 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bbstz" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.767147 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-w4x8m" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770406 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rf68\" (UniqueName: \"kubernetes.io/projected/8244458a-10b4-4c4f-8f9e-dc93e90329af-kube-api-access-2rf68\") pod \"nova-operator-controller-manager-697bc559fc-kpmch\" (UID: \"8244458a-10b4-4c4f-8f9e-dc93e90329af\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770528 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xkv8\" (UniqueName: \"kubernetes.io/projected/d1ba66a9-3383-413f-b2d3-fb13a4e4592b-kube-api-access-4xkv8\") pod \"placement-operator-controller-manager-78f8948974-vxgjl\" (UID: \"d1ba66a9-3383-413f-b2d3-fb13a4e4592b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770590 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9r9\" (UniqueName: \"kubernetes.io/projected/4794dd53-214a-4537-90c9-0527db628c8b-kube-api-access-2s9r9\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks2zq\" (UniqueName: \"kubernetes.io/projected/a59bea52-a8d1-4ac9-8ce0-0a623efcb009-kube-api-access-ks2zq\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pqnb9\" (UID: \"a59bea52-a8d1-4ac9-8ce0-0a623efcb009\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770661 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5p76\" (UniqueName: \"kubernetes.io/projected/d6abdea8-a426-4553-b4e7-8998d96eaed3-kube-api-access-v5p76\") pod \"swift-operator-controller-manager-5f8c65bbfc-5z9dq\" (UID: \"d6abdea8-a426-4553-b4e7-8998d96eaed3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770719 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4j49\" (UniqueName: 
\"kubernetes.io/projected/73aa720c-9e22-4ef9-a5b4-512c0194f0a4-kube-api-access-c4j49\") pod \"octavia-operator-controller-manager-998648c74-nkk8g\" (UID: \"73aa720c-9e22-4ef9-a5b4-512c0194f0a4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlmcn\" (UniqueName: \"kubernetes.io/projected/e25e6854-1001-4962-bd9b-f4cb37ebefe1-kube-api-access-xlmcn\") pod \"ovn-operator-controller-manager-b6456fdb6-nqh5d\" (UID: \"e25e6854-1001-4962-bd9b-f4cb37ebefe1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.770822 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.799304 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.856902 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.873200 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rf68\" (UniqueName: \"kubernetes.io/projected/8244458a-10b4-4c4f-8f9e-dc93e90329af-kube-api-access-2rf68\") pod \"nova-operator-controller-manager-697bc559fc-kpmch\" (UID: \"8244458a-10b4-4c4f-8f9e-dc93e90329af\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.878716 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks2zq\" (UniqueName: \"kubernetes.io/projected/a59bea52-a8d1-4ac9-8ce0-0a623efcb009-kube-api-access-ks2zq\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pqnb9\" (UID: \"a59bea52-a8d1-4ac9-8ce0-0a623efcb009\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.904440 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.906001 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xkv8\" (UniqueName: \"kubernetes.io/projected/d1ba66a9-3383-413f-b2d3-fb13a4e4592b-kube-api-access-4xkv8\") pod \"placement-operator-controller-manager-78f8948974-vxgjl\" (UID: \"d1ba66a9-3383-413f-b2d3-fb13a4e4592b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.906062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9r9\" (UniqueName: \"kubernetes.io/projected/4794dd53-214a-4537-90c9-0527db628c8b-kube-api-access-2s9r9\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.906098 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5p76\" (UniqueName: \"kubernetes.io/projected/d6abdea8-a426-4553-b4e7-8998d96eaed3-kube-api-access-v5p76\") pod \"swift-operator-controller-manager-5f8c65bbfc-5z9dq\" (UID: \"d6abdea8-a426-4553-b4e7-8998d96eaed3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.906151 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4j49\" (UniqueName: \"kubernetes.io/projected/73aa720c-9e22-4ef9-a5b4-512c0194f0a4-kube-api-access-c4j49\") pod \"octavia-operator-controller-manager-998648c74-nkk8g\" (UID: \"73aa720c-9e22-4ef9-a5b4-512c0194f0a4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.928676 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl"] Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.935457 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlmcn\" (UniqueName: \"kubernetes.io/projected/e25e6854-1001-4962-bd9b-f4cb37ebefe1-kube-api-access-xlmcn\") pod \"ovn-operator-controller-manager-b6456fdb6-nqh5d\" (UID: \"e25e6854-1001-4962-bd9b-f4cb37ebefe1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.935719 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:19:59 crc kubenswrapper[4672]: E1206 09:19:59.936141 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 09:19:59 crc kubenswrapper[4672]: E1206 09:19:59.936226 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert podName:4794dd53-214a-4537-90c9-0527db628c8b nodeName:}" 
failed. No retries permitted until 2025-12-06 09:20:00.436199838 +0000 UTC m=+818.180460125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f586tjc" (UID: "4794dd53-214a-4537-90c9-0527db628c8b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.957273 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.962852 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xkv8\" (UniqueName: \"kubernetes.io/projected/d1ba66a9-3383-413f-b2d3-fb13a4e4592b-kube-api-access-4xkv8\") pod \"placement-operator-controller-manager-78f8948974-vxgjl\" (UID: \"d1ba66a9-3383-413f-b2d3-fb13a4e4592b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.968637 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9r9\" (UniqueName: \"kubernetes.io/projected/4794dd53-214a-4537-90c9-0527db628c8b-kube-api-access-2s9r9\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.980083 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5p76\" (UniqueName: \"kubernetes.io/projected/d6abdea8-a426-4553-b4e7-8998d96eaed3-kube-api-access-v5p76\") pod \"swift-operator-controller-manager-5f8c65bbfc-5z9dq\" (UID: \"d6abdea8-a426-4553-b4e7-8998d96eaed3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" Dec 06 09:19:59 crc kubenswrapper[4672]: I1206 09:19:59.987644 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq"] Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.056251 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlmcn\" (UniqueName: \"kubernetes.io/projected/e25e6854-1001-4962-bd9b-f4cb37ebefe1-kube-api-access-xlmcn\") pod \"ovn-operator-controller-manager-b6456fdb6-nqh5d\" (UID: \"e25e6854-1001-4962-bd9b-f4cb37ebefe1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.059492 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.065279 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.067139 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.072583 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.073045 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4j49\" (UniqueName: \"kubernetes.io/projected/73aa720c-9e22-4ef9-a5b4-512c0194f0a4-kube-api-access-c4j49\") pod \"octavia-operator-controller-manager-998648c74-nkk8g\" (UID: \"73aa720c-9e22-4ef9-a5b4-512c0194f0a4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g"
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.073204 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.076708 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.082756 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-c8h66"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.084646 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d"
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.084436 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert podName:6bbb7d8a-ba3a-476a-b09d-0fd084fc325e nodeName:}" failed. No retries permitted until 2025-12-06 09:20:01.08440649 +0000 UTC m=+818.828666777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert") pod "infra-operator-controller-manager-57548d458d-rwjvr" (UID: "6bbb7d8a-ba3a-476a-b09d-0fd084fc325e") : secret "infra-operator-webhook-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.086937 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.095870 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.097527 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.100222 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.101267 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-v8kff"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.149672 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.162223 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.171674 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.173072 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.175272 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjffr\" (UniqueName: \"kubernetes.io/projected/30a955f4-c456-4d9e-9621-dce7e9f7b8b8-kube-api-access-vjffr\") pod \"telemetry-operator-controller-manager-76cc84c6bb-49652\" (UID: \"30a955f4-c456-4d9e-9621-dce7e9f7b8b8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.175777 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgj6v\" (UniqueName: \"kubernetes.io/projected/b88a6b36-14ee-4898-beb2-dae9d2be7600-kube-api-access-mgj6v\") pod \"test-operator-controller-manager-5854674fcc-9p8xf\" (UID: \"b88a6b36-14ee-4898-beb2-dae9d2be7600\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.181318 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.186607 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-n56gg"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.274689 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.275582 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.280340 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-75tn7"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.280807 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.280942 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.283862 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgl4\" (UniqueName: \"kubernetes.io/projected/274d0d53-a194-47e5-b20d-e56155f01e72-kube-api-access-9zgl4\") pod \"watcher-operator-controller-manager-769dc69bc-xbspr\" (UID: \"274d0d53-a194-47e5-b20d-e56155f01e72\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.283913 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgj6v\" (UniqueName: \"kubernetes.io/projected/b88a6b36-14ee-4898-beb2-dae9d2be7600-kube-api-access-mgj6v\") pod \"test-operator-controller-manager-5854674fcc-9p8xf\" (UID: \"b88a6b36-14ee-4898-beb2-dae9d2be7600\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.283954 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjffr\" (UniqueName: \"kubernetes.io/projected/30a955f4-c456-4d9e-9621-dce7e9f7b8b8-kube-api-access-vjffr\") pod \"telemetry-operator-controller-manager-76cc84c6bb-49652\" (UID: \"30a955f4-c456-4d9e-9621-dce7e9f7b8b8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.305225 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.310234 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgj6v\" (UniqueName: \"kubernetes.io/projected/b88a6b36-14ee-4898-beb2-dae9d2be7600-kube-api-access-mgj6v\") pod \"test-operator-controller-manager-5854674fcc-9p8xf\" (UID: \"b88a6b36-14ee-4898-beb2-dae9d2be7600\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.323314 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjffr\" (UniqueName: \"kubernetes.io/projected/30a955f4-c456-4d9e-9621-dce7e9f7b8b8-kube-api-access-vjffr\") pod \"telemetry-operator-controller-manager-76cc84c6bb-49652\" (UID: \"30a955f4-c456-4d9e-9621-dce7e9f7b8b8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.356401 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.357450 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.365271 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-5lchr"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.376647 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.381841 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.386735 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.386812 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgl4\" (UniqueName: \"kubernetes.io/projected/274d0d53-a194-47e5-b20d-e56155f01e72-kube-api-access-9zgl4\") pod \"watcher-operator-controller-manager-769dc69bc-xbspr\" (UID: \"274d0d53-a194-47e5-b20d-e56155f01e72\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.386851 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcgq\" (UniqueName: \"kubernetes.io/projected/72a85d5f-d856-47b2-b0d6-f1fe23722f39-kube-api-access-crcgq\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.387177 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.387640 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tjg9\" (UniqueName: \"kubernetes.io/projected/dd2774f1-51aa-4387-aaf1-02cd8329ae1d-kube-api-access-2tjg9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ntvgh\" (UID: \"dd2774f1-51aa-4387-aaf1-02cd8329ae1d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.412020 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgl4\" (UniqueName: \"kubernetes.io/projected/274d0d53-a194-47e5-b20d-e56155f01e72-kube-api-access-9zgl4\") pod \"watcher-operator-controller-manager-769dc69bc-xbspr\" (UID: \"274d0d53-a194-47e5-b20d-e56155f01e72\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"
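[Note on the retry pattern above: each failed MountVolume.SetUp for the missing cert secrets is re-queued by the kubelet with an exponentially growing delay, visible in the durationBeforeRetry fields (500ms, then 1s, 2s, 4s later in this log as the same secrets keep failing to resolve). A minimal Go sketch of that doubling schedule, not kubelet source; the cap value is assumed purely for illustration:]

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // first retry interval seen in the log
	maxDelay := 2 * time.Minute     // assumed cap, for illustration only
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("retry %d: durationBeforeRetry %s\n", attempt, delay)
		delay *= 2 // the doubling visible across the nestedpendingoperations entries
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

[The delays themselves are read straight off the log; only the cap is invented for the sketch.]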
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.443327 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.478716 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.488363 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.497331 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tjg9\" (UniqueName: \"kubernetes.io/projected/dd2774f1-51aa-4387-aaf1-02cd8329ae1d-kube-api-access-2tjg9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ntvgh\" (UID: \"dd2774f1-51aa-4387-aaf1-02cd8329ae1d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.488876 4672 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.504503 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:01.004113632 +0000 UTC m=+818.748373919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "metrics-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.499021 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.505485 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert podName:4794dd53-214a-4537-90c9-0527db628c8b nodeName:}" failed. No retries permitted until 2025-12-06 09:20:01.505473699 +0000 UTC m=+819.249733986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f586tjc" (UID: "4794dd53-214a-4537-90c9-0527db628c8b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.498868 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.505718 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.505888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crcgq\" (UniqueName: \"kubernetes.io/projected/72a85d5f-d856-47b2-b0d6-f1fe23722f39-kube-api-access-crcgq\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.506778 4672 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: E1206 09:20:00.506897 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:01.006884789 +0000 UTC m=+818.751145076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "webhook-server-cert" not found
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.532759 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcgq\" (UniqueName: \"kubernetes.io/projected/72a85d5f-d856-47b2-b0d6-f1fe23722f39-kube-api-access-crcgq\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.534471 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tjg9\" (UniqueName: \"kubernetes.io/projected/dd2774f1-51aa-4387-aaf1-02cd8329ae1d-kube-api-access-2tjg9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ntvgh\" (UID: \"dd2774f1-51aa-4387-aaf1-02cd8329ae1d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.547272 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.651930 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.777132 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.784709 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94"]
Dec 06 09:20:00 crc kubenswrapper[4672]: I1206 09:20:00.800802 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.017989 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.018059 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.018238 4672 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.018282 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:02.018269223 +0000 UTC m=+819.762529510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "webhook-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.018327 4672 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.018345 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:02.018338805 +0000 UTC m=+819.762599092 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "metrics-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.124151 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.124516 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.124584 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert podName:6bbb7d8a-ba3a-476a-b09d-0fd084fc325e nodeName:}" failed. No retries permitted until 2025-12-06 09:20:03.124557644 +0000 UTC m=+820.868817931 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert") pod "infra-operator-controller-manager-57548d458d-rwjvr" (UID: "6bbb7d8a-ba3a-476a-b09d-0fd084fc325e") : secret "infra-operator-webhook-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.260118 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.264457 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.296374 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.323820 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.340339 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.360259 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.392371 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" event={"ID":"7e99a7a0-5a1d-4143-a8b7-9fb170d119a2","Type":"ContainerStarted","Data":"75029c25e05494ac1f9b92ff870570828ba75dc12cb387783e9b6b9893bbe68c"}
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.396014 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" event={"ID":"8244458a-10b4-4c4f-8f9e-dc93e90329af","Type":"ContainerStarted","Data":"c948eb64537f1d7d79fe0fb1edad5ea530a9078899920f3a016b9be32199a107"}
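[Note: every failing mount in this stretch references a certificate secret that does not exist yet in openstack-operators — infra-operator-webhook-server-cert, webhook-server-cert, metrics-server-cert, openstack-baremetal-operator-webhook-server-cert. Such webhook/metrics cert secrets are normally published asynchronously (for example by cert-manager or the operator's own bootstrap), so these errors typically clear on their own once the secrets appear; until then the affected pods cannot finish volume setup. A hypothetical client-go probe for the missing secrets, names taken from the log, everything else assumed:]

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from the default kubeconfig location (assumption:
	// run off-cluster with admin credentials).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Secret names copied from the error lines above.
	for _, name := range []string{
		"infra-operator-webhook-server-cert",
		"webhook-server-cert",
		"metrics-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
	} {
		_, err := cs.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{})
		fmt.Printf("%s: err=%v\n", name, err)
	}
}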
Dec 06 09:20:01 crc kubenswrapper[4672]: W1206 09:20:01.397942 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda59bea52_a8d1_4ac9_8ce0_0a623efcb009.slice/crio-df8f606cb416de656a5002ce425383fb5b08f200797b9c5a89475659cea37763 WatchSource:0}: Error finding container df8f606cb416de656a5002ce425383fb5b08f200797b9c5a89475659cea37763: Status 404 returned error can't find the container with id df8f606cb416de656a5002ce425383fb5b08f200797b9c5a89475659cea37763
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.398441 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.404562 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" event={"ID":"96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43","Type":"ContainerStarted","Data":"98fc67d634d07614272c58b342f1a47488b29284e094708e7789d5d3e479a954"}
Dec 06 09:20:01 crc kubenswrapper[4672]: W1206 09:20:01.412185 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7753548d_df52_4a65_b447_d20dcd379cde.slice/crio-ad708796093604a39ef504234d28d5d8a5c6eb89982eb326dc3b98eee8063428 WatchSource:0}: Error finding container ad708796093604a39ef504234d28d5d8a5c6eb89982eb326dc3b98eee8063428: Status 404 returned error can't find the container with id ad708796093604a39ef504234d28d5d8a5c6eb89982eb326dc3b98eee8063428
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.415039 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" event={"ID":"ce4e8b8a-4f3a-4303-9455-8eb984c06f57","Type":"ContainerStarted","Data":"0c6d2d974270e33d7b2ce4c24518909c32df0ab6e7030337faf26accaf6c1f20"}
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.423011 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" event={"ID":"018edeb2-cc58-49fe-a7ea-15a8b6646ddd","Type":"ContainerStarted","Data":"e0bd639d0f74f179271eb44f2a80d83528b34adfd71e3cdedff0e310e6f9969d"}
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.424746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" event={"ID":"308c58b1-3c6a-4c79-88fc-b4d515efd96d","Type":"ContainerStarted","Data":"500e774cc1266952302ab7d9f2a509e3a92ad0a6b4f89058dfd71895df07f0a3"}
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.427132 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" event={"ID":"7dc29189-4c37-4886-af89-7c6cb57f237e","Type":"ContainerStarted","Data":"facbe8ca3d20cb47daaaea1f0addb6899d46143d55855557aa889ecfd5ce91e7"}
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.434498 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.450205 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.450263 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.536143 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.536706 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.536759 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert podName:4794dd53-214a-4537-90c9-0527db628c8b nodeName:}" failed. No retries permitted until 2025-12-06 09:20:03.536745655 +0000 UTC m=+821.281005942 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f586tjc" (UID: "4794dd53-214a-4537-90c9-0527db628c8b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.754020 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652"]
Dec 06 09:20:01 crc kubenswrapper[4672]: W1206 09:20:01.762648 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73aa720c_9e22_4ef9_a5b4_512c0194f0a4.slice/crio-03fd355f141dfd60f6ce42a267104def30e9daf8154af4e84b02c7f0260c73e8 WatchSource:0}: Error finding container 03fd355f141dfd60f6ce42a267104def30e9daf8154af4e84b02c7f0260c73e8: Status 404 returned error can't find the container with id 03fd355f141dfd60f6ce42a267104def30e9daf8154af4e84b02c7f0260c73e8
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.767213 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl"]
Dec 06 09:20:01 crc kubenswrapper[4672]: W1206 09:20:01.768087 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb88a6b36_14ee_4898_beb2_dae9d2be7600.slice/crio-e96541399b0e301bb588b9696357c53bad6da79e352e0a9d6244bd0e06214ffe WatchSource:0}: Error finding container e96541399b0e301bb588b9696357c53bad6da79e352e0a9d6244bd0e06214ffe: Status 404 returned error can't find the container with id e96541399b0e301bb588b9696357c53bad6da79e352e0a9d6244bd0e06214ffe
Dec 06 09:20:01 crc kubenswrapper[4672]: W1206 09:20:01.769005 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ba66a9_3383_413f_b2d3_fb13a4e4592b.slice/crio-d382bf593d1ce8301913aeb2b15dfe63ca5d268d68fe0d85dbb206f2f2b8969d WatchSource:0}: Error finding container d382bf593d1ce8301913aeb2b15dfe63ca5d268d68fe0d85dbb206f2f2b8969d: Status 404 returned error can't find the container with id d382bf593d1ce8301913aeb2b15dfe63ca5d268d68fe0d85dbb206f2f2b8969d
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.772726 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mgj6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-9p8xf_openstack-operators(b88a6b36-14ee-4898-beb2-dae9d2be7600): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: W1206 09:20:01.774625 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2774f1_51aa_4387_aaf1_02cd8329ae1d.slice/crio-214716f9061aad7a3403823fbefa2bd5a0b04bb1653d04cd2d54f895075249e1 WatchSource:0}: Error finding container 214716f9061aad7a3403823fbefa2bd5a0b04bb1653d04cd2d54f895075249e1: Status 404 returned error can't find the container with id 214716f9061aad7a3403823fbefa2bd5a0b04bb1653d04cd2d54f895075249e1
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.776274 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xkv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-vxgjl_openstack-operators(d1ba66a9-3383-413f-b2d3-fb13a4e4592b): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.777706 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mgj6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-9p8xf_openstack-operators(b88a6b36-14ee-4898-beb2-dae9d2be7600): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.778673 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2tjg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ntvgh_openstack-operators(dd2774f1-51aa-4387-aaf1-02cd8329ae1d): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.778747 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xkv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-vxgjl_openstack-operators(d1ba66a9-3383-413f-b2d3-fb13a4e4592b): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.778805 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" podUID="b88a6b36-14ee-4898-beb2-dae9d2be7600"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.779926 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" podUID="d1ba66a9-3383-413f-b2d3-fb13a4e4592b"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.779956 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" podUID="dd2774f1-51aa-4387-aaf1-02cd8329ae1d"
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.781872 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.791670 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq"]
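[Note: "pull QPS exceeded" is the kubelet's own client-side image-pull throttle, not a registry error. With a dozen operator pods scheduled in the same second, pulls beyond the token bucket configured by registryPullQPS/registryBurst in the KubeletConfiguration (documented defaults 5 and 10) fail fast with ErrImagePull and are retried later. A rough token-bucket illustration using golang.org/x/time/rate; the kubelet's internal limiter differs in detail:]

package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Token bucket shaped like the kubelet defaults:
	// refill 5 tokens/sec (registryPullQPS), burst capacity 10 (registryBurst).
	lim := rate.NewLimiter(rate.Limit(5), 10)
	for pull := 1; pull <= 15; pull++ {
		if lim.Allow() {
			fmt.Printf("pull %2d: allowed\n", pull)
		} else {
			// This is the case surfaced above as ErrImagePull: pull QPS exceeded.
			fmt.Printf("pull %2d: pull QPS exceeded\n", pull)
		}
	}
}

[In a burst of simultaneous pulls, the first ten drain the bucket and the rest are rejected immediately, which matches the cluster of failures within the same second above.]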
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.793348 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9zgl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xbspr_openstack-operators(274d0d53-a194-47e5-b20d-e56155f01e72): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.795050 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9zgl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xbspr_openstack-operators(274d0d53-a194-47e5-b20d-e56155f01e72): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.796180 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" podUID="274d0d53-a194-47e5-b20d-e56155f01e72"
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.798087 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.802498 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh"]
Dec 06 09:20:01 crc kubenswrapper[4672]: I1206 09:20:01.808333 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr"]
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.816471 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v5p76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-5z9dq_openstack-operators(d6abdea8-a426-4553-b4e7-8998d96eaed3): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.822748 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v5p76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-5z9dq_openstack-operators(d6abdea8-a426-4553-b4e7-8998d96eaed3): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 06 09:20:01 crc kubenswrapper[4672]: E1206 09:20:01.824162 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" podUID="d6abdea8-a426-4553-b4e7-8998d96eaed3"
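[Note: every &Container dump above shows the same two-container pattern: a manager serving plain-HTTP metrics on 127.0.0.1:8080, plus a kube-rbac-proxy sidecar exposing them on :8443 (--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/). Stripped of the TLS termination and the TokenReview/SubjectAccessReview checks kube-rbac-proxy performs, the data path is just a reverse proxy; an illustrative sketch, plain HTTP for brevity:]

package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Matches the --upstream flag in the container specs above.
	upstream, err := url.Parse("http://127.0.0.1:8080/")
	if err != nil {
		log.Fatal(err)
	}
	// The real sidecar listens with TLS and authorizes every request
	// against the Kubernetes API before forwarding; this sketch keeps
	// only the forwarding step.
	proxy := httputil.NewSingleHostReverseProxy(upstream)
	log.Fatal(http.ListenAndServe(":8443", proxy))
}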
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.050566 4672 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.050620 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.050679 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:04.05066012 +0000 UTC m=+821.794920407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "webhook-server-cert" not found Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.050983 4672 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.051093 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:04.051067242 +0000 UTC m=+821.795327689 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "metrics-server-cert" not found Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.449091 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" event={"ID":"27d7a7f5-ab93-40b6-8718-0a8b930d2c0f","Type":"ContainerStarted","Data":"a7692bbcf9bedca9a2e7ed9beff4f3eec1364971ea1583969ae9fa42d9ecba35"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.452660 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" event={"ID":"73aa720c-9e22-4ef9-a5b4-512c0194f0a4","Type":"ContainerStarted","Data":"03fd355f141dfd60f6ce42a267104def30e9daf8154af4e84b02c7f0260c73e8"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.456130 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" event={"ID":"dd2774f1-51aa-4387-aaf1-02cd8329ae1d","Type":"ContainerStarted","Data":"214716f9061aad7a3403823fbefa2bd5a0b04bb1653d04cd2d54f895075249e1"} Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.460100 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" podUID="dd2774f1-51aa-4387-aaf1-02cd8329ae1d" Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.478488 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" event={"ID":"3fda2255-f593-42c6-b17e-2996a6ce7c5e","Type":"ContainerStarted","Data":"275c8968227d9202117263ab8ef4c303f46618f65a10d2179804d2b8dbb89fd4"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.482546 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" event={"ID":"30a955f4-c456-4d9e-9621-dce7e9f7b8b8","Type":"ContainerStarted","Data":"e3f48a6c3c338092d7fb6f09661e8387e3f9f1bbea1d13bbb66ded76822303fb"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.487128 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" event={"ID":"e25e6854-1001-4962-bd9b-f4cb37ebefe1","Type":"ContainerStarted","Data":"774ba5034143178395a8233f406b32c0ebeca2e679b12a57b15a32e38ff81d95"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.492478 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" event={"ID":"b88a6b36-14ee-4898-beb2-dae9d2be7600","Type":"ContainerStarted","Data":"e96541399b0e301bb588b9696357c53bad6da79e352e0a9d6244bd0e06214ffe"} Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.494982 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" podUID="b88a6b36-14ee-4898-beb2-dae9d2be7600" Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.506834 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" event={"ID":"274d0d53-a194-47e5-b20d-e56155f01e72","Type":"ContainerStarted","Data":"4f28f0edf5601400201116a583469f7aa253478c68c99dd5ae17a00810be3948"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.513349 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" event={"ID":"9977f421-c235-40ef-8d9f-2e0125bf3593","Type":"ContainerStarted","Data":"72a2114862a7421288c2c607a328040cb0a9370ef087dc9f0baeb9e5ed50889f"} Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.514075 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" podUID="274d0d53-a194-47e5-b20d-e56155f01e72" Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.539928 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" event={"ID":"d6abdea8-a426-4553-b4e7-8998d96eaed3","Type":"ContainerStarted","Data":"1da10f801bc69b297fb46af5847bbc3396909469f3b4c87a709d3aba83395b40"} Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.566556 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" podUID="d6abdea8-a426-4553-b4e7-8998d96eaed3" Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.618530 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" event={"ID":"a59bea52-a8d1-4ac9-8ce0-0a623efcb009","Type":"ContainerStarted","Data":"df8f606cb416de656a5002ce425383fb5b08f200797b9c5a89475659cea37763"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.618583 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" event={"ID":"7753548d-df52-4a65-b447-d20dcd379cde","Type":"ContainerStarted","Data":"ad708796093604a39ef504234d28d5d8a5c6eb89982eb326dc3b98eee8063428"} Dec 06 09:20:02 crc kubenswrapper[4672]: I1206 09:20:02.618612 4672 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" event={"ID":"d1ba66a9-3383-413f-b2d3-fb13a4e4592b","Type":"ContainerStarted","Data":"d382bf593d1ce8301913aeb2b15dfe63ca5d268d68fe0d85dbb206f2f2b8969d"} Dec 06 09:20:02 crc kubenswrapper[4672]: E1206 09:20:02.622191 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" podUID="d1ba66a9-3383-413f-b2d3-fb13a4e4592b" Dec 06 09:20:03 crc kubenswrapper[4672]: I1206 09:20:03.179249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.179535 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.179623 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert podName:6bbb7d8a-ba3a-476a-b09d-0fd084fc325e nodeName:}" failed. No retries permitted until 2025-12-06 09:20:07.179578045 +0000 UTC m=+824.923838332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert") pod "infra-operator-controller-manager-57548d458d-rwjvr" (UID: "6bbb7d8a-ba3a-476a-b09d-0fd084fc325e") : secret "infra-operator-webhook-server-cert" not found Dec 06 09:20:03 crc kubenswrapper[4672]: I1206 09:20:03.586678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.588719 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.588775 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert podName:4794dd53-214a-4537-90c9-0527db628c8b nodeName:}" failed. No retries permitted until 2025-12-06 09:20:07.588755443 +0000 UTC m=+825.333015730 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f586tjc" (UID: "4794dd53-214a-4537-90c9-0527db628c8b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.627357 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" podUID="dd2774f1-51aa-4387-aaf1-02cd8329ae1d" Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.632011 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" podUID="274d0d53-a194-47e5-b20d-e56155f01e72" Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.632142 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" podUID="d6abdea8-a426-4553-b4e7-8998d96eaed3" Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.632142 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" podUID="b88a6b36-14ee-4898-beb2-dae9d2be7600" Dec 06 09:20:03 crc kubenswrapper[4672]: E1206 09:20:03.653936 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" podUID="d1ba66a9-3383-413f-b2d3-fb13a4e4592b" Dec 06 09:20:04 crc kubenswrapper[4672]: I1206 09:20:04.099536 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" Dec 06 09:20:04 crc kubenswrapper[4672]: I1206 09:20:04.099650 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" Dec 06 09:20:04 crc kubenswrapper[4672]: E1206 09:20:04.099989 4672 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 09:20:04 crc kubenswrapper[4672]: E1206 09:20:04.100085 4672 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 09:20:04 crc kubenswrapper[4672]: E1206 09:20:04.100143 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:08.100123097 +0000 UTC m=+825.844383384 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "webhook-server-cert" not found Dec 06 09:20:04 crc kubenswrapper[4672]: E1206 09:20:04.100397 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:08.100348193 +0000 UTC m=+825.844608480 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "metrics-server-cert" not found Dec 06 09:20:07 crc kubenswrapper[4672]: I1206 09:20:07.248102 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:20:07 crc kubenswrapper[4672]: E1206 09:20:07.251249 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 09:20:07 crc kubenswrapper[4672]: E1206 09:20:07.252422 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert podName:6bbb7d8a-ba3a-476a-b09d-0fd084fc325e nodeName:}" failed. No retries permitted until 2025-12-06 09:20:15.252398287 +0000 UTC m=+832.996658574 (durationBeforeRetry 8s). 
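
The failures above all have the same shape: a Secret-backed volume references a Secret ("metrics-server-cert", "webhook-server-cert", and the two operator webhook "cert" Secrets) that has not been created yet, so every MountVolume.SetUp attempt fails in secret.go and is requeued. The lookup that fails is a plain Secret GET against the API server. A minimal client-go sketch of the same check (namespace and Secret name are taken from the log; loading a local kubeconfig is an assumption for running this outside the node):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumption: a local kubeconfig; inside a pod you would use rest.InClusterConfig().
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // The same lookup the kubelet performs for the "metrics-certs" volume.
        _, err = cs.CoreV1().Secrets("openstack-operators").Get(
            context.TODO(), "metrics-server-cert", metav1.GetOptions{})
        fmt.Println("metrics-server-cert:", err) // "not found" until the cert issuer creates it
    }

Until whichever component issues these certificates creates the Secrets, this check keeps returning NotFound, which is exactly what the retries below keep hitting.
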
Dec 06 09:20:07 crc kubenswrapper[4672]: I1206 09:20:07.248102 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"
Dec 06 09:20:07 crc kubenswrapper[4672]: E1206 09:20:07.251249 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 06 09:20:07 crc kubenswrapper[4672]: E1206 09:20:07.252422 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert podName:6bbb7d8a-ba3a-476a-b09d-0fd084fc325e nodeName:}" failed. No retries permitted until 2025-12-06 09:20:15.252398287 +0000 UTC m=+832.996658574 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert") pod "infra-operator-controller-manager-57548d458d-rwjvr" (UID: "6bbb7d8a-ba3a-476a-b09d-0fd084fc325e") : secret "infra-operator-webhook-server-cert" not found
Dec 06 09:20:07 crc kubenswrapper[4672]: I1206 09:20:07.654410 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"
Dec 06 09:20:07 crc kubenswrapper[4672]: E1206 09:20:07.654692 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 06 09:20:07 crc kubenswrapper[4672]: E1206 09:20:07.654744 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert podName:4794dd53-214a-4537-90c9-0527db628c8b nodeName:}" failed. No retries permitted until 2025-12-06 09:20:15.654728243 +0000 UTC m=+833.398988540 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f586tjc" (UID: "4794dd53-214a-4537-90c9-0527db628c8b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 06 09:20:08 crc kubenswrapper[4672]: I1206 09:20:08.161351 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:08 crc kubenswrapper[4672]: E1206 09:20:08.161541 4672 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 06 09:20:08 crc kubenswrapper[4672]: E1206 09:20:08.161656 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:16.161637571 +0000 UTC m=+833.905897858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "metrics-server-cert" not found
Dec 06 09:20:08 crc kubenswrapper[4672]: E1206 09:20:08.161754 4672 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 06 09:20:08 crc kubenswrapper[4672]: E1206 09:20:08.161863 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs podName:72a85d5f-d856-47b2-b0d6-f1fe23722f39 nodeName:}" failed. No retries permitted until 2025-12-06 09:20:16.161845787 +0000 UTC m=+833.906106144 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-zpt5t" (UID: "72a85d5f-d856-47b2-b0d6-f1fe23722f39") : secret "webhook-server-cert" not found
Dec 06 09:20:08 crc kubenswrapper[4672]: I1206 09:20:08.161888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
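
Note how durationBeforeRetry grows between rounds for the same volumes: the 09:20:03-09:20:04 failures schedule retries 4s out, the 09:20:07-09:20:08 failures 8s out. The kubelet's nestedpendingoperations tracks a per-volume exponential backoff, doubling the delay after each failed attempt up to a cap. A toy sketch of that doubling (the 500ms seed and the cap are illustrative assumptions, not values read from this log; only the 4s -> 8s progression is):

    package main

    import (
        "fmt"
        "time"
    )

    // backoffDelays mirrors the doubling of durationBeforeRetry seen above:
    // each failed attempt doubles the wait, clamped at a maximum.
    func backoffDelays(initial, max time.Duration, n int) []time.Duration {
        delays := make([]time.Duration, 0, n)
        d := initial
        for i := 0; i < n; i++ {
            delays = append(delays, d)
            d *= 2
            if d > max {
                d = max
            }
        }
        return delays
    }

    func main() {
        fmt.Println(backoffDelays(500*time.Millisecond, 2*time.Minute, 8))
    }

Running it prints [500ms 1s 2s 4s 8s 16s 32s 1m4s]; the 4s and 8s steps are the two rounds visible above.
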
Dec 06 09:20:15 crc kubenswrapper[4672]: I1206 09:20:15.260272 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"
Dec 06 09:20:15 crc kubenswrapper[4672]: I1206 09:20:15.265321 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bbb7d8a-ba3a-476a-b09d-0fd084fc325e-cert\") pod \"infra-operator-controller-manager-57548d458d-rwjvr\" (UID: \"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"
Dec 06 09:20:15 crc kubenswrapper[4672]: I1206 09:20:15.317794 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"
Dec 06 09:20:15 crc kubenswrapper[4672]: I1206 09:20:15.666214 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"
Dec 06 09:20:15 crc kubenswrapper[4672]: I1206 09:20:15.672470 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4794dd53-214a-4537-90c9-0527db628c8b-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f586tjc\" (UID: \"4794dd53-214a-4537-90c9-0527db628c8b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"
Dec 06 09:20:15 crc kubenswrapper[4672]: I1206 09:20:15.815340 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"
Dec 06 09:20:16 crc kubenswrapper[4672]: I1206 09:20:16.172929 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:16 crc kubenswrapper[4672]: I1206 09:20:16.173481 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:16 crc kubenswrapper[4672]: I1206 09:20:16.176701 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:16 crc kubenswrapper[4672]: I1206 09:20:16.179186 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72a85d5f-d856-47b2-b0d6-f1fe23722f39-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-zpt5t\" (UID: \"72a85d5f-d856-47b2-b0d6-f1fe23722f39\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:16 crc kubenswrapper[4672]: I1206 09:20:16.369894 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
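
At 09:20:15-09:20:16 the Secrets finally exist: the same retries now log "MountVolume.SetUp succeeded" for all three pods, and the kubelet immediately proceeds to pod sandbox creation ("No sandbox for pod can be found. Need to start a new one"). Nothing else changed; the backoff loop simply won the race once the certificate issuer caught up. A hedged sketch of an external wait loop that mirrors what the kubelet's retries accomplish (the 2s poll interval is an arbitrary choice; names come from the log):

    package main

    import (
        "context"
        "fmt"
        "time"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Block until the webhook cert Secret exists; after that, the pod's
        // "cert" volume can mount and sandbox creation can proceed.
        for {
            _, err := cs.CoreV1().Secrets("openstack-operators").Get(
                context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
            if err == nil {
                break
            }
            if !apierrors.IsNotFound(err) {
                panic(err)
            }
            time.Sleep(2 * time.Second)
        }
        fmt.Println("secret present; volume mounts can succeed")
    }
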
Dec 06 09:20:17 crc kubenswrapper[4672]: E1206 09:20:17.375826 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429"
Dec 06 09:20:17 crc kubenswrapper[4672]: E1206 09:20:17.376046 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nl2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-2zwxr_openstack-operators(96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
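
The "copying config: context canceled" errors are the CRI side of the pull problems: the kubelet's PullImage RPC into the runtime was canceled while the image config was still being copied, so the container start surfaces as ErrImagePull and later syncs report the Back-off entries seen earlier. The failure shape is ordinary Go context cancellation propagating through an RPC; a self-contained illustration (this pull function is a stand-in for the pattern, not the real CRI client):

    package main

    import (
        "context"
        "errors"
        "fmt"
        "time"
    )

    // pull pretends to copy image layers/config for 10s; if its context is
    // canceled first, it fails the same way the CRI pulls above do.
    func pull(ctx context.Context, image string) error {
        select {
        case <-time.After(10 * time.Second):
            return nil
        case <-ctx.Done():
            return fmt.Errorf("pulling %s: copying config: %w", image, ctx.Err())
        }
    }

    func main() {
        ctx, cancel := context.WithCancel(context.Background())
        go func() { time.Sleep(time.Second); cancel() }() // canceled mid-pull
        err := pull(ctx, "quay.io/openstack-k8s-operators/heat-operator")
        fmt.Println(err, errors.Is(err, context.Canceled)) // ... true
    }
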
Dec 06 09:20:17 crc kubenswrapper[4672]: E1206 09:20:17.915024 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59"
Dec 06 09:20:17 crc kubenswrapper[4672]: E1206 09:20:17.915191 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xlmcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-nqh5d_openstack-operators(e25e6854-1001-4962-bd9b-f4cb37ebefe1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:20 crc kubenswrapper[4672]: E1206 09:20:20.075078 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168"
Dec 06 09:20:20 crc kubenswrapper[4672]: E1206 09:20:20.075750 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c4j49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-nkk8g_openstack-operators(73aa720c-9e22-4ef9-a5b4-512c0194f0a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:20 crc kubenswrapper[4672]: E1206 09:20:20.632859 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557"
Dec 06 09:20:20 crc kubenswrapper[4672]: E1206 09:20:20.633057 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ks2zq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-pqnb9_openstack-operators(a59bea52-a8d1-4ac9-8ce0-0a623efcb009): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:23 crc kubenswrapper[4672]: E1206 09:20:23.587458 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385"
Dec 06 09:20:23 crc kubenswrapper[4672]: E1206 09:20:23.588922 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjffr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-49652_openstack-operators(30a955f4-c456-4d9e-9621-dce7e9f7b8b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:25 crc kubenswrapper[4672]: E1206 09:20:25.204024 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85"
Dec 06 09:20:25 crc kubenswrapper[4672]: E1206 09:20:25.204445 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2tks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-6jcpj_openstack-operators(7e99a7a0-5a1d-4143-a8b7-9fb170d119a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:27 crc kubenswrapper[4672]: E1206 09:20:27.337843 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809"
Dec 06 09:20:27 crc kubenswrapper[4672]: E1206 09:20:27.338326 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gjbtr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-p7c94_openstack-operators(018edeb2-cc58-49fe-a7ea-15a8b6646ddd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:33 crc kubenswrapper[4672]: E1206 09:20:33.005997 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670"
Dec 06 09:20:33 crc kubenswrapper[4672]: E1206 09:20:33.006530 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2rf68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-kpmch_openstack-operators(8244458a-10b4-4c4f-8f9e-dc93e90329af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:35 crc kubenswrapper[4672]: E1206 09:20:35.373534 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7"
Dec 06 09:20:35 crc kubenswrapper[4672]: E1206 09:20:35.374130 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmm96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-j7cvj_openstack-operators(308c58b1-3c6a-4c79-88fc-b4d515efd96d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 09:20:35 crc kubenswrapper[4672]: E1206 09:20:35.863641 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2"
Dec 06 09:20:35 crc kubenswrapper[4672]: E1206 09:20:35.864219 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2tjg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ntvgh_openstack-operators(dd2774f1-51aa-4387-aaf1-02cd8329ae1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
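
Each canceled pull is logged twice: once by log.go as the raw CRI error and once by kuberuntime_manager.go with the entire &Container{...} spec of the container that could not start. The specs are worth skimming: the manager containers all run with Drop:[MKNOD], 500m CPU / 512Mi limits, and healthz/readyz probes on :8081, while the rabbitmq-cluster-operator container drops ALL capabilities and runs as a pinned non-root UID. To see which pods remain stuck once the node settles, listing waiting container states is enough; a hedged client-go sketch (namespace from the log, kubeconfig loading assumed):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        pods, err := cs.CoreV1().Pods("openstack-operators").List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, p := range pods.Items {
            for _, st := range p.Status.ContainerStatuses {
                // Waiting reasons here will be ErrImagePull / ImagePullBackOff
                // for the pods reported above.
                if st.State.Waiting != nil {
                    fmt.Println(p.Name, st.Name, st.State.Waiting.Reason)
                }
            }
        }
    }
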
Dec 06 09:20:35 crc kubenswrapper[4672]: E1206 09:20:35.866353 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" podUID="dd2774f1-51aa-4387-aaf1-02cd8329ae1d"
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.428283 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr"]
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.496258 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"]
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.503137 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc"]
Dec 06 09:20:36 crc kubenswrapper[4672]: W1206 09:20:36.561620 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bbb7d8a_ba3a_476a_b09d_0fd084fc325e.slice/crio-f3f40ef843eee20becf0b77c759bbecbd60b0ce8258ae88c6987f2747b428e53 WatchSource:0}: Error finding container f3f40ef843eee20becf0b77c759bbecbd60b0ce8258ae88c6987f2747b428e53: Status 404 returned error can't find the container with id f3f40ef843eee20becf0b77c759bbecbd60b0ce8258ae88c6987f2747b428e53
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.885481 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" event={"ID":"ce4e8b8a-4f3a-4303-9455-8eb984c06f57","Type":"ContainerStarted","Data":"b71155e78b8e5378bd1049e7c4ebf13aa6188ca7d067dd5ca86aca9148ce1048"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.889035 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" event={"ID":"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e","Type":"ContainerStarted","Data":"f3f40ef843eee20becf0b77c759bbecbd60b0ce8258ae88c6987f2747b428e53"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.890836 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" event={"ID":"7dc29189-4c37-4886-af89-7c6cb57f237e","Type":"ContainerStarted","Data":"4105dacbb71ddf76af6bad404b17303d93bdaa56ada24f47aee5c373ac37894a"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.893874 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" event={"ID":"9977f421-c235-40ef-8d9f-2e0125bf3593","Type":"ContainerStarted","Data":"622cfeb778dfa0b3123f481b9ce840e586d25802ceaa861df9b201369133f800"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.896835 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" event={"ID":"27d7a7f5-ab93-40b6-8718-0a8b930d2c0f","Type":"ContainerStarted","Data":"196865d4978af70507a6e44192e19300ad459ba2d272242b48b6ba75c5a59726"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.897915 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" event={"ID":"72a85d5f-d856-47b2-b0d6-f1fe23722f39","Type":"ContainerStarted","Data":"dc5d50b654cf3d1a80df70f5eca3591b791d7bcb778edc521a7ca3cc309b2df4"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.900722 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" event={"ID":"7753548d-df52-4a65-b447-d20dcd379cde","Type":"ContainerStarted","Data":"7934ee2aeb8109390c0f0413186bee55c3004c3b415d5917708ecb83c7551260"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.910153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" event={"ID":"3fda2255-f593-42c6-b17e-2996a6ce7c5e","Type":"ContainerStarted","Data":"04d6e8287cac72446c281fdfcac0e027415bfeb9b1ecafc823d4ef17c0807ab9"}
Dec 06 09:20:36 crc kubenswrapper[4672]: I1206 09:20:36.912483 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" event={"ID":"4794dd53-214a-4537-90c9-0527db628c8b","Type":"ContainerStarted","Data":"503865ede7a63d706ec38d66465ee7599c9e651da3adda950710c1edecf4b300"}
Dec 06 09:20:37 crc kubenswrapper[4672]: I1206 09:20:37.922721 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" event={"ID":"d1ba66a9-3383-413f-b2d3-fb13a4e4592b","Type":"ContainerStarted","Data":"cde95913bce058c44ac37e81f933de9d04d9567387600ba2dc99adf3ce1ae1ce"}
Dec 06 09:20:37 crc kubenswrapper[4672]: I1206 09:20:37.928734 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" event={"ID":"b88a6b36-14ee-4898-beb2-dae9d2be7600","Type":"ContainerStarted","Data":"07f5adb889903eb0d64c281aea2b86c07fb6e498fdc09a00dc858835088bef97"}
Dec 06 09:20:37 crc kubenswrapper[4672]: I1206 09:20:37.950082 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" event={"ID":"274d0d53-a194-47e5-b20d-e56155f01e72","Type":"ContainerStarted","Data":"5e17e93f2d737c3629872a341c684dc59b796d5d0fc838063fecaac821cd0e69"}
Dec 06 09:20:39 crc kubenswrapper[4672]: I1206 09:20:39.967848 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" event={"ID":"d6abdea8-a426-4553-b4e7-8998d96eaed3","Type":"ContainerStarted","Data":"19e9d07fc8e61c5324d8d18a29485d9936a63f61d0168649f7a770fdce018af6"}
Dec 06 09:20:39 crc kubenswrapper[4672]: I1206 09:20:39.971698 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" event={"ID":"72a85d5f-d856-47b2-b0d6-f1fe23722f39","Type":"ContainerStarted","Data":"c8ef292421e6ea047ee35e732ce35aa7828f82e23bb6ea13f9a6c2d34c55f681"}
Dec 06 09:20:39 crc kubenswrapper[4672]: I1206 09:20:39.972341 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t"
Dec 06 09:20:41 crc kubenswrapper[4672]: E1206 09:20:41.851511 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" podUID="8244458a-10b4-4c4f-8f9e-dc93e90329af"
Dec 06 09:20:41 crc kubenswrapper[4672]: E1206 09:20:41.959821 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" podUID="a59bea52-a8d1-4ac9-8ce0-0a623efcb009"
Dec 06 09:20:41 crc kubenswrapper[4672]: I1206 09:20:41.992825 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" event={"ID":"a59bea52-a8d1-4ac9-8ce0-0a623efcb009","Type":"ContainerStarted","Data":"0d9068829c9dd803510fc7e8e98fe6aa05b0c51629bff5ae91d9a190e12be3b6"}
Dec 06 09:20:42 crc kubenswrapper[4672]: I1206 09:20:42.002611 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" event={"ID":"8244458a-10b4-4c4f-8f9e-dc93e90329af","Type":"ContainerStarted","Data":"718f00f035a1f0830ddb157049d1d3200d13c872cf49e2065c6e73a672b263b1"}
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.004452 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" podUID="8244458a-10b4-4c4f-8f9e-dc93e90329af"
Dec 06 09:20:42 crc kubenswrapper[4672]: I1206 09:20:42.017164 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" podStartSLOduration=42.017131201 podStartE2EDuration="42.017131201s" podCreationTimestamp="2025-12-06 09:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:20:40.005177313 +0000 UTC m=+857.749437600" watchObservedRunningTime="2025-12-06 09:20:42.017131201 +0000 UTC m=+859.761391488"
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.269297 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" podUID="30a955f4-c456-4d9e-9621-dce7e9f7b8b8"
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.472191 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" podUID="e25e6854-1001-4962-bd9b-f4cb37ebefe1"
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.727520 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" podUID="308c58b1-3c6a-4c79-88fc-b4d515efd96d"
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.747778 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" podUID="7e99a7a0-5a1d-4143-a8b7-9fb170d119a2"
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.891868 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" podUID="73aa720c-9e22-4ef9-a5b4-512c0194f0a4"
Dec 06 09:20:42 crc kubenswrapper[4672]: E1206 09:20:42.893837 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" podUID="96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43"
Dec 06 09:20:43 crc kubenswrapper[4672]: E1206 09:20:43.002485 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" podUID="018edeb2-cc58-49fe-a7ea-15a8b6646ddd"
Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.029274 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" event={"ID":"96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43","Type":"ContainerStarted","Data":"fbcced9a7ff2c965af6b853bd99a0cdde1f0ff1024d07936c01e08326f5c25be"}
Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.041808 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" event={"ID":"018edeb2-cc58-49fe-a7ea-15a8b6646ddd","Type":"ContainerStarted","Data":"995766e443839e4d4fa412bc29d883f29a86845e6404738f179fbb49d971abc9"}
Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.055362 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" event={"ID":"27d7a7f5-ab93-40b6-8718-0a8b930d2c0f","Type":"ContainerStarted","Data":"2c363bf1f560d6dbd895497e37e5c8a013e2ff6998ff852bdb8213a6d2ab7f4a"}
Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.055403 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz"
Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.070319 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz"
Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.079845 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-crbgz" podStartSLOduration=3.661503324 podStartE2EDuration="44.079827155s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.398210333 +0000 UTC m=+819.142470620" lastFinishedPulling="2025-12-06 09:20:41.816534164 +0000 UTC m=+859.560794451" observedRunningTime="2025-12-06 09:20:43.078172808 +0000 UTC m=+860.822433095" watchObservedRunningTime="2025-12-06 09:20:43.079827155 +0000 UTC m=+860.824087442"
pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" event={"ID":"73aa720c-9e22-4ef9-a5b4-512c0194f0a4","Type":"ContainerStarted","Data":"78b24022c5daafd9e743c93e67a54359ba8f38efa7d6a4c9eebef82b541f9d98"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.097699 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" event={"ID":"7dc29189-4c37-4886-af89-7c6cb57f237e","Type":"ContainerStarted","Data":"5b883c6c23684a85794330fd76761224b2b41f7106792a9d7531f8cc7de3df93"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.098494 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.110161 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" event={"ID":"7e99a7a0-5a1d-4143-a8b7-9fb170d119a2","Type":"ContainerStarted","Data":"8b11a508f5e9c1b87cddde4d6b61bb28947b8a900c98ad6aaf9a5fef814fb40c"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.115913 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.147687 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" event={"ID":"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e","Type":"ContainerStarted","Data":"f6f9f4d39582565fcfc6ccc6d6649fa3e9039914d3b6f49379efd7a2a0ab418c"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.147739 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" event={"ID":"6bbb7d8a-ba3a-476a-b09d-0fd084fc325e","Type":"ContainerStarted","Data":"e42c02641cd94b93e953bfd03b1d69086aff5b3751a91435e329424e31098c1a"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.148507 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.185475 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" event={"ID":"274d0d53-a194-47e5-b20d-e56155f01e72","Type":"ContainerStarted","Data":"820fa5c08749c9e80d26ef14e92b4240edadec5aa5cbc4d61f1a12740018f659"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.186453 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.191162 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.212931 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" event={"ID":"308c58b1-3c6a-4c79-88fc-b4d515efd96d","Type":"ContainerStarted","Data":"821719ce5cfaf2026f6ed09a7099381b2d098ccdd0c6161e9349cdaf8c06e159"} Dec 06 09:20:43 crc kubenswrapper[4672]: E1206 09:20:43.217792 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" podUID="308c58b1-3c6a-4c79-88fc-b4d515efd96d" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.221222 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" event={"ID":"7753548d-df52-4a65-b447-d20dcd379cde","Type":"ContainerStarted","Data":"6d35097245c62c8fa41eb38919924acb36ba7c0373a5e31f8db2acb689850553"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.222201 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.229333 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.233288 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" event={"ID":"30a955f4-c456-4d9e-9621-dce7e9f7b8b8","Type":"ContainerStarted","Data":"613864fb6374178c5cbb3f97a5e3f8782c6a46bfc94d0808b780161643cbd68c"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.236927 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-cpc5n" podStartSLOduration=4.8391676 podStartE2EDuration="45.236910885s" podCreationTimestamp="2025-12-06 09:19:58 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.258028505 +0000 UTC m=+819.002288792" lastFinishedPulling="2025-12-06 09:20:41.6557718 +0000 UTC m=+859.400032077" observedRunningTime="2025-12-06 09:20:43.222568105 +0000 UTC m=+860.966828392" watchObservedRunningTime="2025-12-06 09:20:43.236910885 +0000 UTC m=+860.981171172" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.242002 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" event={"ID":"e25e6854-1001-4962-bd9b-f4cb37ebefe1","Type":"ContainerStarted","Data":"fcabeb0fa8300f4c2f64b04bb35f83bed716556a763cd0cc06b36cd407d6a593"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.254965 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" event={"ID":"9977f421-c235-40ef-8d9f-2e0125bf3593","Type":"ContainerStarted","Data":"37987f18574b7b00cf55feaa75acd007a26938d5cea6b35b6c1142408b772eb3"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.257348 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.261497 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.264424 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" event={"ID":"d6abdea8-a426-4553-b4e7-8998d96eaed3","Type":"ContainerStarted","Data":"102a4054e6b835c38c3b8b857c21621a5824a516485f0544ee95a468984d1719"} Dec 06 
09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.265137 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.301665 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" event={"ID":"3fda2255-f593-42c6-b17e-2996a6ce7c5e","Type":"ContainerStarted","Data":"881b3b9b56aece1fdf788302e99647c6332e36d8470c915f59d19fc37d95bd04"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.303005 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.311970 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.325772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" event={"ID":"d1ba66a9-3383-413f-b2d3-fb13a4e4592b","Type":"ContainerStarted","Data":"ca228af8f60545efa4a503d273dc67fc3858d4efad307a12e3d6ba63fc010aab"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.328266 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.333420 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.350130 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" event={"ID":"b88a6b36-14ee-4898-beb2-dae9d2be7600","Type":"ContainerStarted","Data":"22335e7ada7f84a047c6407204d86f39c197bdb706b0a4443eae50e26ab4a037"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.350777 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.357915 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.368655 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" event={"ID":"4794dd53-214a-4537-90c9-0527db628c8b","Type":"ContainerStarted","Data":"36f3821897d8aab0e24148ac1fe15ea793c8e24185a690e4d088b64a0e7b86fb"} Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.372725 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.455142 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" podStartSLOduration=4.593689241 podStartE2EDuration="44.455123295s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.816362031 +0000 UTC m=+819.560622308" lastFinishedPulling="2025-12-06 
09:20:41.677796075 +0000 UTC m=+859.422056362" observedRunningTime="2025-12-06 09:20:43.398280186 +0000 UTC m=+861.142540563" watchObservedRunningTime="2025-12-06 09:20:43.455123295 +0000 UTC m=+861.199383582" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.507536 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbspr" podStartSLOduration=4.731402539 podStartE2EDuration="44.507509099s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.793142182 +0000 UTC m=+819.537402469" lastFinishedPulling="2025-12-06 09:20:41.569248732 +0000 UTC m=+859.313509029" observedRunningTime="2025-12-06 09:20:43.489047933 +0000 UTC m=+861.233308230" watchObservedRunningTime="2025-12-06 09:20:43.507509099 +0000 UTC m=+861.251769386" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.703069 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-8ql2p" podStartSLOduration=4.3333694640000004 podStartE2EDuration="44.703047914s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.394221912 +0000 UTC m=+819.138482199" lastFinishedPulling="2025-12-06 09:20:41.763900352 +0000 UTC m=+859.508160649" observedRunningTime="2025-12-06 09:20:43.672969694 +0000 UTC m=+861.417229991" watchObservedRunningTime="2025-12-06 09:20:43.703047914 +0000 UTC m=+861.447308201" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.779375 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" podStartSLOduration=39.793857016 podStartE2EDuration="44.779360258s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:36.565971092 +0000 UTC m=+854.310231379" lastFinishedPulling="2025-12-06 09:20:41.551474314 +0000 UTC m=+859.295734621" observedRunningTime="2025-12-06 09:20:43.778924465 +0000 UTC m=+861.523184752" watchObservedRunningTime="2025-12-06 09:20:43.779360258 +0000 UTC m=+861.523620545" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.845565 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dvzm4" podStartSLOduration=4.441179707 podStartE2EDuration="44.845548197s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.421756351 +0000 UTC m=+819.166016638" lastFinishedPulling="2025-12-06 09:20:41.826124841 +0000 UTC m=+859.570385128" observedRunningTime="2025-12-06 09:20:43.840889688 +0000 UTC m=+861.585149975" watchObservedRunningTime="2025-12-06 09:20:43.845548197 +0000 UTC m=+861.589808484" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.863239 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zxcvx" podStartSLOduration=4.368919398 podStartE2EDuration="44.863220752s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.409146999 +0000 UTC m=+819.153407276" lastFinishedPulling="2025-12-06 09:20:41.903448343 +0000 UTC m=+859.647708630" observedRunningTime="2025-12-06 09:20:43.859491237 +0000 UTC m=+861.603751524" watchObservedRunningTime="2025-12-06 09:20:43.863220752 +0000 UTC m=+861.607481039" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 
09:20:43.927639 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" podStartSLOduration=39.94786253 podStartE2EDuration="44.927621622s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:36.57164262 +0000 UTC m=+854.315902907" lastFinishedPulling="2025-12-06 09:20:41.551401712 +0000 UTC m=+859.295661999" observedRunningTime="2025-12-06 09:20:43.924370091 +0000 UTC m=+861.668630378" watchObservedRunningTime="2025-12-06 09:20:43.927621622 +0000 UTC m=+861.671881929" Dec 06 09:20:43 crc kubenswrapper[4672]: I1206 09:20:43.998121 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vxgjl" podStartSLOduration=5.182882589 podStartE2EDuration="44.998104522s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.776177458 +0000 UTC m=+819.520437745" lastFinishedPulling="2025-12-06 09:20:41.591399371 +0000 UTC m=+859.335659678" observedRunningTime="2025-12-06 09:20:43.992215128 +0000 UTC m=+861.736475415" watchObservedRunningTime="2025-12-06 09:20:43.998104522 +0000 UTC m=+861.742364809" Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.036225 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9p8xf" podStartSLOduration=5.045380317 podStartE2EDuration="45.036203127s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.772575918 +0000 UTC m=+819.516836205" lastFinishedPulling="2025-12-06 09:20:41.763398728 +0000 UTC m=+859.507659015" observedRunningTime="2025-12-06 09:20:44.031265189 +0000 UTC m=+861.775525486" watchObservedRunningTime="2025-12-06 09:20:44.036203127 +0000 UTC m=+861.780463414" Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.372943 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" event={"ID":"4794dd53-214a-4537-90c9-0527db628c8b","Type":"ContainerStarted","Data":"0bbc8d688d4762c86be160646795ced778d651143badb2741c01d10a3c5e33ee"} Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.375272 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" event={"ID":"a59bea52-a8d1-4ac9-8ce0-0a623efcb009","Type":"ContainerStarted","Data":"7de27a2a8d0142eb0c6979e95a8a025663a799ec0659303907d2ae98316dd310"} Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.375579 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.377070 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" event={"ID":"ce4e8b8a-4f3a-4303-9455-8eb984c06f57","Type":"ContainerStarted","Data":"d0807d2b8ba66605eeda5287c747d088c66bd18dc2dd77042cf079842a36f751"} Dec 06 09:20:44 crc kubenswrapper[4672]: E1206 09:20:44.380954 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" podUID="308c58b1-3c6a-4c79-88fc-b4d515efd96d" Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.382762 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5z9dq" Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.429033 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" podStartSLOduration=5.457954197 podStartE2EDuration="46.429017837s" podCreationTimestamp="2025-12-06 09:19:58 +0000 UTC" firstStartedPulling="2025-12-06 09:20:00.84965156 +0000 UTC m=+818.593911847" lastFinishedPulling="2025-12-06 09:20:41.8207152 +0000 UTC m=+859.564975487" observedRunningTime="2025-12-06 09:20:44.427025681 +0000 UTC m=+862.171285968" watchObservedRunningTime="2025-12-06 09:20:44.429017837 +0000 UTC m=+862.173278124" Dec 06 09:20:44 crc kubenswrapper[4672]: I1206 09:20:44.429921 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" podStartSLOduration=4.360225905 podStartE2EDuration="45.429916012s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.408758558 +0000 UTC m=+819.153018835" lastFinishedPulling="2025-12-06 09:20:42.478448655 +0000 UTC m=+860.222708942" observedRunningTime="2025-12-06 09:20:44.405970122 +0000 UTC m=+862.150230429" watchObservedRunningTime="2025-12-06 09:20:44.429916012 +0000 UTC m=+862.174176299" Dec 06 09:20:45 crc kubenswrapper[4672]: I1206 09:20:45.385652 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" event={"ID":"96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43","Type":"ContainerStarted","Data":"dd26e2d84623fdb7501173350f116f10d5839a025a10552ee727b5b774cb725a"} Dec 06 09:20:45 crc kubenswrapper[4672]: I1206 09:20:45.391861 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:20:45 crc kubenswrapper[4672]: I1206 09:20:45.393685 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-lh7x2" Dec 06 09:20:45 crc kubenswrapper[4672]: I1206 09:20:45.436352 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" podStartSLOduration=2.667863851 podStartE2EDuration="46.436331743s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.340324275 +0000 UTC m=+819.084584562" lastFinishedPulling="2025-12-06 09:20:45.108792167 +0000 UTC m=+862.853052454" observedRunningTime="2025-12-06 09:20:45.418919436 +0000 UTC m=+863.163179723" watchObservedRunningTime="2025-12-06 09:20:45.436331743 +0000 UTC m=+863.180592030" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.380481 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-zpt5t" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.394009 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" 
event={"ID":"e25e6854-1001-4962-bd9b-f4cb37ebefe1","Type":"ContainerStarted","Data":"f6d97067925d885bdd16546fe09551f1c89d4c7744a852dca9b6a4b331d89c81"} Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.395300 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.395535 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" event={"ID":"7e99a7a0-5a1d-4143-a8b7-9fb170d119a2","Type":"ContainerStarted","Data":"ecaa181e498b8e7533e86b5ffab5b5aa9c95200e0c57ca52954aad8faf639b9e"} Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.396166 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.397907 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" event={"ID":"73aa720c-9e22-4ef9-a5b4-512c0194f0a4","Type":"ContainerStarted","Data":"f4a933bd370fe77302e61c86b60fa5ebc446733f7a2222520ab7b301021f4828"} Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.398549 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.400648 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" event={"ID":"8244458a-10b4-4c4f-8f9e-dc93e90329af","Type":"ContainerStarted","Data":"4602b6a13342dca6658f5dd1525150838601b0301bb3b94e9fa3bb6f89c947b0"} Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.401388 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.406388 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" event={"ID":"30a955f4-c456-4d9e-9621-dce7e9f7b8b8","Type":"ContainerStarted","Data":"480e0d05e6c681a3f00384e4b5c7b28d401a0bab428724ec9eb3321a1b020620"} Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.406523 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.408138 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" event={"ID":"018edeb2-cc58-49fe-a7ea-15a8b6646ddd","Type":"ContainerStarted","Data":"3b1fccb1bcd0e70a1ede6f7fb53be71c7af8e6c948a6c6655926b96f813340c8"} Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.408712 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.408756 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.480205 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" podStartSLOduration=3.137180659 podStartE2EDuration="47.48018193s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:00.848961211 +0000 UTC m=+818.593221498" lastFinishedPulling="2025-12-06 09:20:45.191962482 +0000 UTC m=+862.936222769" observedRunningTime="2025-12-06 09:20:46.475666704 +0000 UTC m=+864.219926991" watchObservedRunningTime="2025-12-06 09:20:46.48018193 +0000 UTC m=+864.224442217" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.513036 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" podStartSLOduration=4.00627132 podStartE2EDuration="47.513014907s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.768944276 +0000 UTC m=+819.513204563" lastFinishedPulling="2025-12-06 09:20:45.275687863 +0000 UTC m=+863.019948150" observedRunningTime="2025-12-06 09:20:46.505443107 +0000 UTC m=+864.249703404" watchObservedRunningTime="2025-12-06 09:20:46.513014907 +0000 UTC m=+864.257275194" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.523271 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" podStartSLOduration=4.1116824770000004 podStartE2EDuration="48.523251614s" podCreationTimestamp="2025-12-06 09:19:58 +0000 UTC" firstStartedPulling="2025-12-06 09:20:00.755242162 +0000 UTC m=+818.499502449" lastFinishedPulling="2025-12-06 09:20:45.166811309 +0000 UTC m=+862.911071586" observedRunningTime="2025-12-06 09:20:46.519211031 +0000 UTC m=+864.263471328" watchObservedRunningTime="2025-12-06 09:20:46.523251614 +0000 UTC m=+864.267511901" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.544645 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" podStartSLOduration=3.8705413269999998 podStartE2EDuration="47.544627622s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.769032688 +0000 UTC m=+819.513292975" lastFinishedPulling="2025-12-06 09:20:45.443118983 +0000 UTC m=+863.187379270" observedRunningTime="2025-12-06 09:20:46.54168554 +0000 UTC m=+864.285945827" watchObservedRunningTime="2025-12-06 09:20:46.544627622 +0000 UTC m=+864.288887909" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.561415 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" podStartSLOduration=3.49718657 podStartE2EDuration="47.56139545s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.392810332 +0000 UTC m=+819.137070619" lastFinishedPulling="2025-12-06 09:20:45.457019212 +0000 UTC m=+863.201279499" observedRunningTime="2025-12-06 09:20:46.556111893 +0000 UTC m=+864.300372180" watchObservedRunningTime="2025-12-06 09:20:46.56139545 +0000 UTC m=+864.305655727" Dec 06 09:20:46 crc kubenswrapper[4672]: I1206 09:20:46.588941 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" podStartSLOduration=3.457383059 podStartE2EDuration="47.58892477s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.292274872 +0000 UTC 
m=+819.036535149" lastFinishedPulling="2025-12-06 09:20:45.423816573 +0000 UTC m=+863.168076860" observedRunningTime="2025-12-06 09:20:46.587736996 +0000 UTC m=+864.331997283" watchObservedRunningTime="2025-12-06 09:20:46.58892477 +0000 UTC m=+864.333185057" Dec 06 09:20:47 crc kubenswrapper[4672]: E1206 09:20:47.558682 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" podUID="dd2774f1-51aa-4387-aaf1-02cd8329ae1d" Dec 06 09:20:50 crc kubenswrapper[4672]: I1206 09:20:50.075080 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" Dec 06 09:20:50 crc kubenswrapper[4672]: I1206 09:20:50.075161 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pqnb9" Dec 06 09:20:50 crc kubenswrapper[4672]: I1206 09:20:50.087949 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nqh5d" Dec 06 09:20:50 crc kubenswrapper[4672]: I1206 09:20:50.378624 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-nkk8g" Dec 06 09:20:50 crc kubenswrapper[4672]: I1206 09:20:50.465180 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-49652" Dec 06 09:20:55 crc kubenswrapper[4672]: I1206 09:20:55.325808 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-rwjvr" Dec 06 09:20:55 crc kubenswrapper[4672]: I1206 09:20:55.821291 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f586tjc" Dec 06 09:20:58 crc kubenswrapper[4672]: I1206 09:20:58.558449 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:20:59 crc kubenswrapper[4672]: I1206 09:20:59.304233 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6jcpj" Dec 06 09:20:59 crc kubenswrapper[4672]: I1206 09:20:59.372825 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-p7c94" Dec 06 09:20:59 crc kubenswrapper[4672]: I1206 09:20:59.409925 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2zwxr" Dec 06 09:20:59 crc kubenswrapper[4672]: I1206 09:20:59.500437 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" event={"ID":"308c58b1-3c6a-4c79-88fc-b4d515efd96d","Type":"ContainerStarted","Data":"3d9b2dffe5673670bc2865d52bf5524ae332847ffd90a25d18e78e6fcd3d99f9"} Dec 06 09:20:59 crc kubenswrapper[4672]: I1206 09:20:59.500632 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:20:59 crc kubenswrapper[4672]: I1206 09:20:59.522209 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" podStartSLOduration=2.850578087 podStartE2EDuration="1m0.522189453s" podCreationTimestamp="2025-12-06 09:19:59 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.299452853 +0000 UTC m=+819.043713140" lastFinishedPulling="2025-12-06 09:20:58.971064219 +0000 UTC m=+876.715324506" observedRunningTime="2025-12-06 09:20:59.51669393 +0000 UTC m=+877.260954237" watchObservedRunningTime="2025-12-06 09:20:59.522189453 +0000 UTC m=+877.266449740" Dec 06 09:21:03 crc kubenswrapper[4672]: I1206 09:21:03.524879 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" event={"ID":"dd2774f1-51aa-4387-aaf1-02cd8329ae1d","Type":"ContainerStarted","Data":"33388b7e7f2392839c53492d4b2829ca8380a8adb28ac7125afe41d05688444d"} Dec 06 09:21:03 crc kubenswrapper[4672]: I1206 09:21:03.540501 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ntvgh" podStartSLOduration=2.309414644 podStartE2EDuration="1m3.540478852s" podCreationTimestamp="2025-12-06 09:20:00 +0000 UTC" firstStartedPulling="2025-12-06 09:20:01.778592476 +0000 UTC m=+819.522852763" lastFinishedPulling="2025-12-06 09:21:03.009656684 +0000 UTC m=+880.753916971" observedRunningTime="2025-12-06 09:21:03.537169699 +0000 UTC m=+881.281429976" watchObservedRunningTime="2025-12-06 09:21:03.540478852 +0000 UTC m=+881.284739139" Dec 06 09:21:09 crc kubenswrapper[4672]: I1206 09:21:09.864453 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-j7cvj" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.119340 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-mrcqm"] Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.121067 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.126116 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.126292 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8xqbw" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.129221 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.129330 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.133222 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-mrcqm"] Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.204175 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-xtk2q"] Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.205291 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-xtk2q" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.216954 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.261317 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-xtk2q"] Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.262929 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lzw\" (UniqueName: \"kubernetes.io/projected/e587738d-3fc4-4187-b3ee-e77508f06a89-kube-api-access-s2lzw\") pod \"dnsmasq-dns-5cd484bb89-mrcqm\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.262999 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e587738d-3fc4-4187-b3ee-e77508f06a89-config\") pod \"dnsmasq-dns-5cd484bb89-mrcqm\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.364869 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-config\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.365206 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lzw\" (UniqueName: \"kubernetes.io/projected/e587738d-3fc4-4187-b3ee-e77508f06a89-kube-api-access-s2lzw\") pod \"dnsmasq-dns-5cd484bb89-mrcqm\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.365348 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e587738d-3fc4-4187-b3ee-e77508f06a89-config\") pod \"dnsmasq-dns-5cd484bb89-mrcqm\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.365461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-dns-svc\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.365637 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k42b\" (UniqueName: \"kubernetes.io/projected/18bf4162-886d-4591-8a9b-6ae9352f6537-kube-api-access-2k42b\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q" Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.366259 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e587738d-3fc4-4187-b3ee-e77508f06a89-config\") pod \"dnsmasq-dns-5cd484bb89-mrcqm\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" 
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.382974 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lzw\" (UniqueName: \"kubernetes.io/projected/e587738d-3fc4-4187-b3ee-e77508f06a89-kube-api-access-s2lzw\") pod \"dnsmasq-dns-5cd484bb89-mrcqm\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.435729 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.467348 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k42b\" (UniqueName: \"kubernetes.io/projected/18bf4162-886d-4591-8a9b-6ae9352f6537-kube-api-access-2k42b\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.467452 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-config\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.467519 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-dns-svc\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.468463 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-config\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.468470 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-dns-svc\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.487275 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k42b\" (UniqueName: \"kubernetes.io/projected/18bf4162-886d-4591-8a9b-6ae9352f6537-kube-api-access-2k42b\") pod \"dnsmasq-dns-567c455747-xtk2q\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.528865 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-xtk2q"
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.902948 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-mrcqm"]
Dec 06 09:21:31 crc kubenswrapper[4672]: W1206 09:21:31.908552 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode587738d_3fc4_4187_b3ee_e77508f06a89.slice/crio-f6f1a04d7cf2a77539d398d51a119c15fd129b731cbb454760f8955577ffe27b WatchSource:0}: Error finding container f6f1a04d7cf2a77539d398d51a119c15fd129b731cbb454760f8955577ffe27b: Status 404 returned error can't find the container with id f6f1a04d7cf2a77539d398d51a119c15fd129b731cbb454760f8955577ffe27b
Dec 06 09:21:31 crc kubenswrapper[4672]: I1206 09:21:31.970928 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-xtk2q"]
Dec 06 09:21:31 crc kubenswrapper[4672]: W1206 09:21:31.974632 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18bf4162_886d_4591_8a9b_6ae9352f6537.slice/crio-c1500258e893c182f024ec61913c195b74949d094619d0981b3b881d7524afc7 WatchSource:0}: Error finding container c1500258e893c182f024ec61913c195b74949d094619d0981b3b881d7524afc7: Status 404 returned error can't find the container with id c1500258e893c182f024ec61913c195b74949d094619d0981b3b881d7524afc7
Dec 06 09:21:32 crc kubenswrapper[4672]: I1206 09:21:32.748774 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-xtk2q" event={"ID":"18bf4162-886d-4591-8a9b-6ae9352f6537","Type":"ContainerStarted","Data":"c1500258e893c182f024ec61913c195b74949d094619d0981b3b881d7524afc7"}
Dec 06 09:21:32 crc kubenswrapper[4672]: I1206 09:21:32.751177 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" event={"ID":"e587738d-3fc4-4187-b3ee-e77508f06a89","Type":"ContainerStarted","Data":"f6f1a04d7cf2a77539d398d51a119c15fd129b731cbb454760f8955577ffe27b"}
Dec 06 09:21:33 crc kubenswrapper[4672]: I1206 09:21:33.834873 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-xtk2q"]
Dec 06 09:21:33 crc kubenswrapper[4672]: I1206 09:21:33.879950 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-bkstt"]
Dec 06 09:21:33 crc kubenswrapper[4672]: I1206 09:21:33.881102 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:33 crc kubenswrapper[4672]: I1206 09:21:33.899118 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-bkstt"]
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.021276 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-config\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.021341 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrkfs\" (UniqueName: \"kubernetes.io/projected/b19d3022-686a-4cad-9a8f-cb89e48efeca-kube-api-access-vrkfs\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.021377 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.123132 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrkfs\" (UniqueName: \"kubernetes.io/projected/b19d3022-686a-4cad-9a8f-cb89e48efeca-kube-api-access-vrkfs\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.123195 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.123260 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-config\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.124089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-config\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.124619 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.168085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrkfs\" (UniqueName: \"kubernetes.io/projected/b19d3022-686a-4cad-9a8f-cb89e48efeca-kube-api-access-vrkfs\") pod \"dnsmasq-dns-bc4b48fc9-bkstt\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.208960 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-mrcqm"]
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.225041 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.260028 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-d99dl"]
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.261143 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.280193 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-d99dl"]
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.427323 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-config\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.427661 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx4l6\" (UniqueName: \"kubernetes.io/projected/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-kube-api-access-nx4l6\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.427721 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-dns-svc\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.535295 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx4l6\" (UniqueName: \"kubernetes.io/projected/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-kube-api-access-nx4l6\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.535404 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-dns-svc\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.535438 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-config\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.536700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-dns-svc\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.536712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-config\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.557120 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx4l6\" (UniqueName: \"kubernetes.io/projected/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-kube-api-access-nx4l6\") pod \"dnsmasq-dns-cb666b895-d99dl\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.636122 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-d99dl"
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.729135 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-bkstt"]
Dec 06 09:21:34 crc kubenswrapper[4672]: I1206 09:21:34.777658 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" event={"ID":"b19d3022-686a-4cad-9a8f-cb89e48efeca","Type":"ContainerStarted","Data":"5d6ba808750fe7f149611ecd513a302ea6d2b69cfa685a94db0c7fe6faaede50"}
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.050978 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.052338 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.055203 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.055378 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.056588 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.057518 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.057807 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.057832 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.057937 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5xzx4"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.073178 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145082 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145121 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145175 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145246 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-config-data\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145274 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145353 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145412 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54ae723f-36b7-4991-9439-23af064249fa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145431 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145486 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4fg\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-kube-api-access-qq4fg\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145510 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54ae723f-36b7-4991-9439-23af064249fa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.145557 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.249046 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-d99dl"]
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.264577 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-config-data\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.264628 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265419 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-config-data\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265493 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265521 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54ae723f-36b7-4991-9439-23af064249fa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265537 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265563 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq4fg\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-kube-api-access-qq4fg\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265578 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54ae723f-36b7-4991-9439-23af064249fa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265614 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265633 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265648 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265676 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.265950 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0"
Dec 06 09:21:35 crc kubenswrapper[4672]: I1206
09:21:35.266430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.267591 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.267772 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.273279 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54ae723f-36b7-4991-9439-23af064249fa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.274070 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54ae723f-36b7-4991-9439-23af064249fa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.274699 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.290303 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4fg\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-kube-api-access-qq4fg\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.304744 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.325215 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.399846 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " pod="openstack/rabbitmq-server-0" Dec 06 
09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.405523 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.406890 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414028 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414177 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-l5sdg" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414296 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414394 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414556 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414670 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.414797 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.433144 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569797 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bbe623e-19ec-49f2-bfa4-65728b94d035-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569877 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569898 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569921 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569946 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569963 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.569978 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bbe623e-19ec-49f2-bfa4-65728b94d035-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.570011 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.570053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.570078 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.570094 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbf2q\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-kube-api-access-nbf2q\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675344 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675437 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675464 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675488 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbf2q\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-kube-api-access-nbf2q\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675512 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bbe623e-19ec-49f2-bfa4-65728b94d035-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675557 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675579 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675620 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.675700 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bbe623e-19ec-49f2-bfa4-65728b94d035-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.682523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.687215 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.688921 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.689557 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.692177 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.692563 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.692952 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bbe623e-19ec-49f2-bfa4-65728b94d035-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.697085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bbe623e-19ec-49f2-bfa4-65728b94d035-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.701089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.703633 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.709566 4672 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.713732 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbf2q\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-kube-api-access-nbf2q\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.736475 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.758421 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:21:35 crc kubenswrapper[4672]: I1206 09:21:35.825531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-d99dl" event={"ID":"e6d719fd-72b2-4fe2-a634-b92e6b6f3902","Type":"ContainerStarted","Data":"f85cde69bc2fd1dacd29fb76c8948abb24cfdc36c07e2595714cb22cae8ee3ac"} Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.387831 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.483327 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:21:36 crc kubenswrapper[4672]: W1206 09:21:36.510849 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ae723f_36b7_4991_9439_23af064249fa.slice/crio-8058a379a7cb28e6983aab03c6bdda9846d95fec18852baf8148b9c9e1d55b53 WatchSource:0}: Error finding container 8058a379a7cb28e6983aab03c6bdda9846d95fec18852baf8148b9c9e1d55b53: Status 404 returned error can't find the container with id 8058a379a7cb28e6983aab03c6bdda9846d95fec18852baf8148b9c9e1d55b53 Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.711003 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.721060 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.726201 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.732518 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.733000 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4d774" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.733266 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.733624 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.736644 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814320 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814386 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814410 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkfxs\" (UniqueName: \"kubernetes.io/projected/37d0f081-e2da-4845-9097-31607c42efc4-kube-api-access-zkfxs\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814444 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814475 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37d0f081-e2da-4845-9097-31607c42efc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814496 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d0f081-e2da-4845-9097-31607c42efc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814629 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.814669 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0f081-e2da-4845-9097-31607c42efc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.884205 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1bbe623e-19ec-49f2-bfa4-65728b94d035","Type":"ContainerStarted","Data":"629e8a99471dbe0c3c436af3076b7d79db4d171f2c92685ff437d4ec1106b5b9"} Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.889517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"54ae723f-36b7-4991-9439-23af064249fa","Type":"ContainerStarted","Data":"8058a379a7cb28e6983aab03c6bdda9846d95fec18852baf8148b9c9e1d55b53"} Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.917561 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.919752 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkfxs\" (UniqueName: \"kubernetes.io/projected/37d0f081-e2da-4845-9097-31607c42efc4-kube-api-access-zkfxs\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.919831 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.919877 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37d0f081-e2da-4845-9097-31607c42efc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.919916 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d0f081-e2da-4845-9097-31607c42efc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.919965 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: 
I1206 09:21:36.919328 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.920530 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0f081-e2da-4845-9097-31607c42efc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.920569 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.920945 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.926991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.927995 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37d0f081-e2da-4845-9097-31607c42efc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.928089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37d0f081-e2da-4845-9097-31607c42efc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.938228 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0f081-e2da-4845-9097-31607c42efc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.971681 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.975521 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d0f081-e2da-4845-9097-31607c42efc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:36 crc kubenswrapper[4672]: I1206 09:21:36.979473 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkfxs\" (UniqueName: \"kubernetes.io/projected/37d0f081-e2da-4845-9097-31607c42efc4-kube-api-access-zkfxs\") pod \"openstack-galera-0\" (UID: \"37d0f081-e2da-4845-9097-31607c42efc4\") " pod="openstack/openstack-galera-0" Dec 06 09:21:37 crc kubenswrapper[4672]: I1206 09:21:37.065527 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 09:21:37 crc kubenswrapper[4672]: I1206 09:21:37.845355 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 09:21:37 crc kubenswrapper[4672]: W1206 09:21:37.868589 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37d0f081_e2da_4845_9097_31607c42efc4.slice/crio-a18689661942dd8ecfd58e97298f046cb1581e153209fd517ab498a1d747a7bd WatchSource:0}: Error finding container a18689661942dd8ecfd58e97298f046cb1581e153209fd517ab498a1d747a7bd: Status 404 returned error can't find the container with id a18689661942dd8ecfd58e97298f046cb1581e153209fd517ab498a1d747a7bd Dec 06 09:21:37 crc kubenswrapper[4672]: I1206 09:21:37.920763 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"37d0f081-e2da-4845-9097-31607c42efc4","Type":"ContainerStarted","Data":"a18689661942dd8ecfd58e97298f046cb1581e153209fd517ab498a1d747a7bd"} Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.186840 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.187966 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.195795 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.197042 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.197211 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.197337 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pvgrn" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.257953 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c53efb2-1642-4efd-b920-7ad41e6c136a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.257999 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8c53efb2-1642-4efd-b920-7ad41e6c136a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.258036 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.258054 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.258090 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.258118 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.258132 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhjn\" (UniqueName: \"kubernetes.io/projected/8c53efb2-1642-4efd-b920-7ad41e6c136a-kube-api-access-kkhjn\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " 
pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.258161 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53efb2-1642-4efd-b920-7ad41e6c136a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.264306 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.364903 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.364976 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.364995 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkhjn\" (UniqueName: \"kubernetes.io/projected/8c53efb2-1642-4efd-b920-7ad41e6c136a-kube-api-access-kkhjn\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.365764 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.365832 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.366266 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53efb2-1642-4efd-b920-7ad41e6c136a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.366388 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c53efb2-1642-4efd-b920-7ad41e6c136a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.366435 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8c53efb2-1642-4efd-b920-7ad41e6c136a-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.366505 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.366536 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.367700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8c53efb2-1642-4efd-b920-7ad41e6c136a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.367818 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.368383 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8c53efb2-1642-4efd-b920-7ad41e6c136a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.383374 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53efb2-1642-4efd-b920-7ad41e6c136a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.389424 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c53efb2-1642-4efd-b920-7ad41e6c136a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.402433 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.422377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhjn\" (UniqueName: \"kubernetes.io/projected/8c53efb2-1642-4efd-b920-7ad41e6c136a-kube-api-access-kkhjn\") pod \"openstack-cell1-galera-0\" (UID: \"8c53efb2-1642-4efd-b920-7ad41e6c136a\") " pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc 
kubenswrapper[4672]: I1206 09:21:38.507679 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.739726 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.740900 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.756046 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mb2mf" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.756083 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.756667 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.782361 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-config-data\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.782401 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.782418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmj27\" (UniqueName: \"kubernetes.io/projected/944c7316-15fa-4e57-896c-65205e8137b2-kube-api-access-rmj27\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.782498 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-kolla-config\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.782639 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.789415 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.884233 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.884990 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-config-data\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.885063 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.885133 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmj27\" (UniqueName: \"kubernetes.io/projected/944c7316-15fa-4e57-896c-65205e8137b2-kube-api-access-rmj27\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.885556 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-kolla-config\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.886046 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-kolla-config\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.886416 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-config-data\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.895343 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.917404 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:38 crc kubenswrapper[4672]: I1206 09:21:38.922970 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmj27\" (UniqueName: \"kubernetes.io/projected/944c7316-15fa-4e57-896c-65205e8137b2-kube-api-access-rmj27\") pod \"memcached-0\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " pod="openstack/memcached-0" Dec 06 09:21:39 crc kubenswrapper[4672]: I1206 09:21:39.068938 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 09:21:39 crc kubenswrapper[4672]: I1206 09:21:39.362814 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 09:21:39 crc kubenswrapper[4672]: I1206 09:21:39.719789 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 09:21:39 crc kubenswrapper[4672]: I1206 09:21:39.993333 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8c53efb2-1642-4efd-b920-7ad41e6c136a","Type":"ContainerStarted","Data":"a956110b7ab158ed8e78fbcdc25e43a4576f52ca169fa108adedd39ea490cb78"} Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.702752 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.704525 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.734748 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bmrjd" Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.770714 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.854764 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bldqw\" (UniqueName: \"kubernetes.io/projected/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62-kube-api-access-bldqw\") pod \"kube-state-metrics-0\" (UID: \"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62\") " pod="openstack/kube-state-metrics-0" Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.956383 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bldqw\" (UniqueName: \"kubernetes.io/projected/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62-kube-api-access-bldqw\") pod \"kube-state-metrics-0\" (UID: \"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62\") " pod="openstack/kube-state-metrics-0" Dec 06 09:21:41 crc kubenswrapper[4672]: I1206 09:21:41.990471 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bldqw\" (UniqueName: \"kubernetes.io/projected/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62-kube-api-access-bldqw\") pod \"kube-state-metrics-0\" (UID: \"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62\") " pod="openstack/kube-state-metrics-0" Dec 06 09:21:42 crc kubenswrapper[4672]: I1206 09:21:42.034904 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.091318 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hxgmq"] Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.092966 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.096584 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.096786 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6lmkm" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.096915 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641508 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-run-ovn\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641574 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-ovn-controller-tls-certs\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641637 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-log-ovn\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641681 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-scripts\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641700 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9xcj\" (UniqueName: \"kubernetes.io/projected/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-kube-api-access-p9xcj\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641740 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-run\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.641770 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-combined-ca-bundle\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.656713 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hxgmq"] Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.702518 4672 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rsxq7"] Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.704404 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.727361 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rsxq7"] Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742435 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-run-ovn\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742475 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-lib\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742496 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-log\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742511 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-ovn-controller-tls-certs\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742535 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wnb\" (UniqueName: \"kubernetes.io/projected/8cc7e2b2-ad6c-44f4-b477-951936b867d8-kube-api-access-f9wnb\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742554 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-log-ovn\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742594 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-scripts\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742628 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-run\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742644 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9xcj\" (UniqueName: \"kubernetes.io/projected/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-kube-api-access-p9xcj\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-etc-ovs\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742679 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-run\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742708 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-combined-ca-bundle\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.742751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc7e2b2-ad6c-44f4-b477-951936b867d8-scripts\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.743243 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-log-ovn\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.743329 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-run\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.743728 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-var-run-ovn\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.747686 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-scripts\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.751827 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-ovn-controller-tls-certs\") pod \"ovn-controller-hxgmq\" (UID: 
\"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.760314 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-combined-ca-bundle\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.760466 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9xcj\" (UniqueName: \"kubernetes.io/projected/2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6-kube-api-access-p9xcj\") pod \"ovn-controller-hxgmq\" (UID: \"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6\") " pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.844748 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-lib\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.844805 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-log\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.844856 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wnb\" (UniqueName: \"kubernetes.io/projected/8cc7e2b2-ad6c-44f4-b477-951936b867d8-kube-api-access-f9wnb\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.844893 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-run\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.845242 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-lib\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.845526 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-run\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.845753 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-var-log\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.845812 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-etc-ovs\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.845868 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc7e2b2-ad6c-44f4-b477-951936b867d8-scripts\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.846173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8cc7e2b2-ad6c-44f4-b477-951936b867d8-etc-ovs\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.851192 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc7e2b2-ad6c-44f4-b477-951936b867d8-scripts\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.868328 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wnb\" (UniqueName: \"kubernetes.io/projected/8cc7e2b2-ad6c-44f4-b477-951936b867d8-kube-api-access-f9wnb\") pod \"ovn-controller-ovs-rsxq7\" (UID: \"8cc7e2b2-ad6c-44f4-b477-951936b867d8\") " pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.955001 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.982810 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.984675 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.987691 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.987927 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.988154 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bxxtk" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.988335 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 06 09:21:45 crc kubenswrapper[4672]: I1206 09:21:45.991086 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.000327 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.036895 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150450 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f085ca1-832b-40dc-b131-2c287df92f6e-config\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150509 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f085ca1-832b-40dc-b131-2c287df92f6e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150627 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150648 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f085ca1-832b-40dc-b131-2c287df92f6e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150719 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150744 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k82rb\" (UniqueName: \"kubernetes.io/projected/9f085ca1-832b-40dc-b131-2c287df92f6e-kube-api-access-k82rb\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.150786 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252573 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: 
I1206 09:21:46.252654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f085ca1-832b-40dc-b131-2c287df92f6e-config\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252672 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252712 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f085ca1-832b-40dc-b131-2c287df92f6e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252736 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252755 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f085ca1-832b-40dc-b131-2c287df92f6e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252799 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.252824 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k82rb\" (UniqueName: \"kubernetes.io/projected/9f085ca1-832b-40dc-b131-2c287df92f6e-kube-api-access-k82rb\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.253936 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f085ca1-832b-40dc-b131-2c287df92f6e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.254706 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f085ca1-832b-40dc-b131-2c287df92f6e-config\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.257011 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " 
pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.257033 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.258246 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f085ca1-832b-40dc-b131-2c287df92f6e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.273937 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k82rb\" (UniqueName: \"kubernetes.io/projected/9f085ca1-832b-40dc-b131-2c287df92f6e-kube-api-access-k82rb\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.274470 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.287231 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f085ca1-832b-40dc-b131-2c287df92f6e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.297481 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f085ca1-832b-40dc-b131-2c287df92f6e\") " pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:46 crc kubenswrapper[4672]: I1206 09:21:46.323931 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 09:21:47 crc kubenswrapper[4672]: I1206 09:21:47.622442 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"944c7316-15fa-4e57-896c-65205e8137b2","Type":"ContainerStarted","Data":"87ee70e07fd1d22559b58b1e86accac12cc0448488943f601306eaaea4f24921"} Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.016813 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.018014 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.020073 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-g8jnd" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.020957 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.021456 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.021607 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.035926 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093379 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093651 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093674 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093720 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-config\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093824 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4sj4\" (UniqueName: \"kubernetes.io/projected/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-kube-api-access-k4sj4\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093868 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " 
pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.093888 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.195566 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4sj4\" (UniqueName: \"kubernetes.io/projected/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-kube-api-access-k4sj4\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.196048 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.196076 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.196324 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.200461 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.202478 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.202753 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.202811 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.202832 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.202902 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-config\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.203281 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.203790 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-config\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.203947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.206760 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.207198 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.211261 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4sj4\" (UniqueName: \"kubernetes.io/projected/4cddfb03-e3ff-478e-91c7-e3b58145d1e6-kube-api-access-k4sj4\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.218848 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4cddfb03-e3ff-478e-91c7-e3b58145d1e6\") " pod="openstack/ovsdbserver-sb-0" Dec 06 09:21:49 crc kubenswrapper[4672]: I1206 09:21:49.339978 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 09:22:01 crc kubenswrapper[4672]: E1206 09:22:01.896860 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a" Dec 06 09:22:01 crc kubenswrapper[4672]: E1206 09:22:01.897607 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zkfxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(37d0f081-e2da-4845-9097-31607c42efc4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:01 crc kubenswrapper[4672]: E1206 09:22:01.898810 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="37d0f081-e2da-4845-9097-31607c42efc4" Dec 06 09:22:02 crc kubenswrapper[4672]: E1206 09:22:02.759247 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a\\\"\"" pod="openstack/openstack-galera-0" 
podUID="37d0f081-e2da-4845-9097-31607c42efc4" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.475325 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.475836 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kkhjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(8c53efb2-1642-4efd-b920-7ad41e6c136a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.477029 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="8c53efb2-1642-4efd-b920-7ad41e6c136a" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.480664 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.480870 4672 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nbf2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(1bbe623e-19ec-49f2-bfa4-65728b94d035): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.482085 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.497865 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.498116 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qq4fg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(54ae723f-36b7-4991-9439-23af064249fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.499698 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="54ae723f-36b7-4991-9439-23af064249fa" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.768723 4672 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.769144 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="8c53efb2-1642-4efd-b920-7ad41e6c136a" Dec 06 09:22:03 crc kubenswrapper[4672]: E1206 09:22:03.769589 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d\\\"\"" pod="openstack/rabbitmq-server-0" podUID="54ae723f-36b7-4991-9439-23af064249fa" Dec 06 09:22:04 crc kubenswrapper[4672]: E1206 09:22:04.169068 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached@sha256:dad2336390cae6705133deefaa09c9e39512cf29133aa009006e3962c8022108" Dec 06 09:22:04 crc kubenswrapper[4672]: E1206 09:22:04.169241 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached@sha256:dad2336390cae6705133deefaa09c9e39512cf29133aa009006e3962c8022108,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nf7h559hbfh5bch589h544h657h659hf6h5d6h56fh689h5f7h5bdh67bh96h67dh5cch67h68h666h599h57ch5b5h5bdh686h644h569h57h669h5b5h5c5q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rmj27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(944c7316-15fa-4e57-896c-65205e8137b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:04 crc kubenswrapper[4672]: E1206 09:22:04.171155 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="944c7316-15fa-4e57-896c-65205e8137b2" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.477759 4672 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qfs56"] Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.480080 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.488700 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qfs56"] Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.577812 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-utilities\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.577978 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nv76\" (UniqueName: \"kubernetes.io/projected/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-kube-api-access-4nv76\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.578045 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-catalog-content\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.679255 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-utilities\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.679313 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nv76\" (UniqueName: \"kubernetes.io/projected/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-kube-api-access-4nv76\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.679339 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-catalog-content\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.679871 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-utilities\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.679895 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-catalog-content\") pod 
\"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.715026 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nv76\" (UniqueName: \"kubernetes.io/projected/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-kube-api-access-4nv76\") pod \"certified-operators-qfs56\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: I1206 09:22:04.796356 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:04 crc kubenswrapper[4672]: E1206 09:22:04.800090 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached@sha256:dad2336390cae6705133deefaa09c9e39512cf29133aa009006e3962c8022108\\\"\"" pod="openstack/memcached-0" podUID="944c7316-15fa-4e57-896c-65205e8137b2" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.508798 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.509299 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vrkfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bc4b48fc9-bkstt_openstack(b19d3022-686a-4cad-9a8f-cb89e48efeca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.510761 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.666706 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.666848 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nx4l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-cb666b895-d99dl_openstack(e6d719fd-72b2-4fe2-a634-b92e6b6f3902): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.667964 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-cb666b895-d99dl" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.694680 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.694842 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2lzw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cd484bb89-mrcqm_openstack(e587738d-3fc4-4187-b3ee-e77508f06a89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.695095 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.695164 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k42b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-xtk2q_openstack(18bf4162-886d-4591-8a9b-6ae9352f6537): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.697629 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" podUID="e587738d-3fc4-4187-b3ee-e77508f06a89" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.697584 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-xtk2q" podUID="18bf4162-886d-4591-8a9b-6ae9352f6537" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.805174 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" Dec 06 09:22:05 crc kubenswrapper[4672]: E1206 09:22:05.805313 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-cb666b895-d99dl" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" Dec 06 09:22:05 crc kubenswrapper[4672]: I1206 09:22:05.932433 4672 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.031079 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hxgmq"] Dec 06 09:22:06 crc kubenswrapper[4672]: W1206 09:22:06.041422 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453d0f39_ffba_4ad4_9a6f_07d7539c60ce.slice/crio-747a24322d5a3674f4a9c8abe3b7564c3c5f25a5deaa9601cecbd6c83d66cce9 WatchSource:0}: Error finding container 747a24322d5a3674f4a9c8abe3b7564c3c5f25a5deaa9601cecbd6c83d66cce9: Status 404 returned error can't find the container with id 747a24322d5a3674f4a9c8abe3b7564c3c5f25a5deaa9601cecbd6c83d66cce9 Dec 06 09:22:06 crc kubenswrapper[4672]: W1206 09:22:06.043731 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f9d3ddd_e69d_48c6_ae37_094f18a1ddc6.slice/crio-4778f9ce6c6ccc2ba51b81287b8e472c440b9b4efca336854a4ce55c3ab3ae2e WatchSource:0}: Error finding container 4778f9ce6c6ccc2ba51b81287b8e472c440b9b4efca336854a4ce55c3ab3ae2e: Status 404 returned error can't find the container with id 4778f9ce6c6ccc2ba51b81287b8e472c440b9b4efca336854a4ce55c3ab3ae2e Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.059357 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qfs56"] Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.246185 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 09:22:06 crc kubenswrapper[4672]: W1206 09:22:06.269838 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cddfb03_e3ff_478e_91c7_e3b58145d1e6.slice/crio-548b404a673c1f058d17385170070829e6515d9017ddfa08a6f1076d75fac694 WatchSource:0}: Error finding container 548b404a673c1f058d17385170070829e6515d9017ddfa08a6f1076d75fac694: Status 404 returned error can't find the container with id 548b404a673c1f058d17385170070829e6515d9017ddfa08a6f1076d75fac694 Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.286709 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-xtk2q" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.339237 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.418051 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k42b\" (UniqueName: \"kubernetes.io/projected/18bf4162-886d-4591-8a9b-6ae9352f6537-kube-api-access-2k42b\") pod \"18bf4162-886d-4591-8a9b-6ae9352f6537\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.418374 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2lzw\" (UniqueName: \"kubernetes.io/projected/e587738d-3fc4-4187-b3ee-e77508f06a89-kube-api-access-s2lzw\") pod \"e587738d-3fc4-4187-b3ee-e77508f06a89\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.418453 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-config\") pod \"18bf4162-886d-4591-8a9b-6ae9352f6537\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.418550 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e587738d-3fc4-4187-b3ee-e77508f06a89-config\") pod \"e587738d-3fc4-4187-b3ee-e77508f06a89\" (UID: \"e587738d-3fc4-4187-b3ee-e77508f06a89\") " Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.418641 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-dns-svc\") pod \"18bf4162-886d-4591-8a9b-6ae9352f6537\" (UID: \"18bf4162-886d-4591-8a9b-6ae9352f6537\") " Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.419117 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e587738d-3fc4-4187-b3ee-e77508f06a89-config" (OuterVolumeSpecName: "config") pod "e587738d-3fc4-4187-b3ee-e77508f06a89" (UID: "e587738d-3fc4-4187-b3ee-e77508f06a89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.419145 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18bf4162-886d-4591-8a9b-6ae9352f6537" (UID: "18bf4162-886d-4591-8a9b-6ae9352f6537"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.419156 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-config" (OuterVolumeSpecName: "config") pod "18bf4162-886d-4591-8a9b-6ae9352f6537" (UID: "18bf4162-886d-4591-8a9b-6ae9352f6537"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.425040 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e587738d-3fc4-4187-b3ee-e77508f06a89-kube-api-access-s2lzw" (OuterVolumeSpecName: "kube-api-access-s2lzw") pod "e587738d-3fc4-4187-b3ee-e77508f06a89" (UID: "e587738d-3fc4-4187-b3ee-e77508f06a89"). InnerVolumeSpecName "kube-api-access-s2lzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.425962 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bf4162-886d-4591-8a9b-6ae9352f6537-kube-api-access-2k42b" (OuterVolumeSpecName: "kube-api-access-2k42b") pod "18bf4162-886d-4591-8a9b-6ae9352f6537" (UID: "18bf4162-886d-4591-8a9b-6ae9352f6537"). InnerVolumeSpecName "kube-api-access-2k42b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.507832 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rsxq7"] Dec 06 09:22:06 crc kubenswrapper[4672]: W1206 09:22:06.511341 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cc7e2b2_ad6c_44f4_b477_951936b867d8.slice/crio-00ae501f42e5655c426b3413d2d52c5efaf20f671b74c54538075e6987f080f9 WatchSource:0}: Error finding container 00ae501f42e5655c426b3413d2d52c5efaf20f671b74c54538075e6987f080f9: Status 404 returned error can't find the container with id 00ae501f42e5655c426b3413d2d52c5efaf20f671b74c54538075e6987f080f9 Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.519997 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.520024 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k42b\" (UniqueName: \"kubernetes.io/projected/18bf4162-886d-4591-8a9b-6ae9352f6537-kube-api-access-2k42b\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.520035 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2lzw\" (UniqueName: \"kubernetes.io/projected/e587738d-3fc4-4187-b3ee-e77508f06a89-kube-api-access-s2lzw\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.520045 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bf4162-886d-4591-8a9b-6ae9352f6537-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.520053 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e587738d-3fc4-4187-b3ee-e77508f06a89-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.809485 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-xtk2q" event={"ID":"18bf4162-886d-4591-8a9b-6ae9352f6537","Type":"ContainerDied","Data":"c1500258e893c182f024ec61913c195b74949d094619d0981b3b881d7524afc7"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.809731 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-xtk2q" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.811940 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4cddfb03-e3ff-478e-91c7-e3b58145d1e6","Type":"ContainerStarted","Data":"548b404a673c1f058d17385170070829e6515d9017ddfa08a6f1076d75fac694"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.815195 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq" event={"ID":"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6","Type":"ContainerStarted","Data":"4778f9ce6c6ccc2ba51b81287b8e472c440b9b4efca336854a4ce55c3ab3ae2e"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.818109 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62","Type":"ContainerStarted","Data":"c1a5266830d1127726b933682c619f5717e57d1112af092a9dec2d6b7102b4b5"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.819204 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" event={"ID":"e587738d-3fc4-4187-b3ee-e77508f06a89","Type":"ContainerDied","Data":"f6f1a04d7cf2a77539d398d51a119c15fd129b731cbb454760f8955577ffe27b"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.819260 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-mrcqm" Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.822967 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfs56" event={"ID":"453d0f39-ffba-4ad4-9a6f-07d7539c60ce","Type":"ContainerDied","Data":"197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.822887 4672 generic.go:334] "Generic (PLEG): container finished" podID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerID="197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7" exitCode=0 Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.823130 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfs56" event={"ID":"453d0f39-ffba-4ad4-9a6f-07d7539c60ce","Type":"ContainerStarted","Data":"747a24322d5a3674f4a9c8abe3b7564c3c5f25a5deaa9601cecbd6c83d66cce9"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.839466 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rsxq7" event={"ID":"8cc7e2b2-ad6c-44f4-b477-951936b867d8","Type":"ContainerStarted","Data":"00ae501f42e5655c426b3413d2d52c5efaf20f671b74c54538075e6987f080f9"} Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.898740 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-xtk2q"] Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.910454 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-xtk2q"] Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.932755 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-mrcqm"] Dec 06 09:22:06 crc kubenswrapper[4672]: I1206 09:22:06.936960 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-mrcqm"] Dec 06 09:22:07 crc kubenswrapper[4672]: I1206 09:22:07.326468 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 09:22:07 crc kubenswrapper[4672]: 
W1206 09:22:07.505915 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f085ca1_832b_40dc_b131_2c287df92f6e.slice/crio-1ac1ceca664502ed623f15982cbb9390dce78553e375d1de0e12013f65f80718 WatchSource:0}: Error finding container 1ac1ceca664502ed623f15982cbb9390dce78553e375d1de0e12013f65f80718: Status 404 returned error can't find the container with id 1ac1ceca664502ed623f15982cbb9390dce78553e375d1de0e12013f65f80718 Dec 06 09:22:07 crc kubenswrapper[4672]: I1206 09:22:07.847088 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9f085ca1-832b-40dc-b131-2c287df92f6e","Type":"ContainerStarted","Data":"1ac1ceca664502ed623f15982cbb9390dce78553e375d1de0e12013f65f80718"} Dec 06 09:22:08 crc kubenswrapper[4672]: I1206 09:22:08.570847 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bf4162-886d-4591-8a9b-6ae9352f6537" path="/var/lib/kubelet/pods/18bf4162-886d-4591-8a9b-6ae9352f6537/volumes" Dec 06 09:22:08 crc kubenswrapper[4672]: I1206 09:22:08.571398 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e587738d-3fc4-4187-b3ee-e77508f06a89" path="/var/lib/kubelet/pods/e587738d-3fc4-4187-b3ee-e77508f06a89/volumes" Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.884944 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq" event={"ID":"2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6","Type":"ContainerStarted","Data":"4183daadc139c135982bfa6eadab7ca74204d75e54eeae34bdfca3d7131f67d6"} Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.886077 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hxgmq" Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.893507 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9f085ca1-832b-40dc-b131-2c287df92f6e","Type":"ContainerStarted","Data":"f70af43f25ce79fd98884874569bc2d3d7f126a3eba026c744fadf8bef5af8ea"} Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.895692 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62","Type":"ContainerStarted","Data":"b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3"} Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.897144 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.900806 4672 generic.go:334] "Generic (PLEG): container finished" podID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerID="9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59" exitCode=0 Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.900877 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfs56" event={"ID":"453d0f39-ffba-4ad4-9a6f-07d7539c60ce","Type":"ContainerDied","Data":"9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59"} Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.916311 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hxgmq" podStartSLOduration=22.05686438 podStartE2EDuration="26.91628968s" podCreationTimestamp="2025-12-06 09:21:45 +0000 UTC" firstStartedPulling="2025-12-06 09:22:06.046778394 +0000 UTC m=+943.791038681" lastFinishedPulling="2025-12-06 
09:22:10.906203664 +0000 UTC m=+948.650463981" observedRunningTime="2025-12-06 09:22:11.91077668 +0000 UTC m=+949.655036987" watchObservedRunningTime="2025-12-06 09:22:11.91628968 +0000 UTC m=+949.660549967" Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.918177 4672 generic.go:334] "Generic (PLEG): container finished" podID="8cc7e2b2-ad6c-44f4-b477-951936b867d8" containerID="ba3f4ba4e067b537797b552cb3699c081ec9fbe855c9afc878ace2e83316a6be" exitCode=0 Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.918235 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rsxq7" event={"ID":"8cc7e2b2-ad6c-44f4-b477-951936b867d8","Type":"ContainerDied","Data":"ba3f4ba4e067b537797b552cb3699c081ec9fbe855c9afc878ace2e83316a6be"} Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.926061 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4cddfb03-e3ff-478e-91c7-e3b58145d1e6","Type":"ContainerStarted","Data":"4ba9693daf8e860c1e0e70f88dbc03787d100a2ed3091d9afe43fa72fa4cdc55"} Dec 06 09:22:11 crc kubenswrapper[4672]: I1206 09:22:11.954333 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=25.999180924 podStartE2EDuration="30.954312572s" podCreationTimestamp="2025-12-06 09:21:41 +0000 UTC" firstStartedPulling="2025-12-06 09:22:05.95234977 +0000 UTC m=+943.696610057" lastFinishedPulling="2025-12-06 09:22:10.907481408 +0000 UTC m=+948.651741705" observedRunningTime="2025-12-06 09:22:11.947651771 +0000 UTC m=+949.691912058" watchObservedRunningTime="2025-12-06 09:22:11.954312572 +0000 UTC m=+949.698572859" Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.319944 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.320012 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.939961 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfs56" event={"ID":"453d0f39-ffba-4ad4-9a6f-07d7539c60ce","Type":"ContainerStarted","Data":"a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e"} Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.946348 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rsxq7" event={"ID":"8cc7e2b2-ad6c-44f4-b477-951936b867d8","Type":"ContainerStarted","Data":"d214ad0fd8a65758a0e8882a76ee5920eac1efa50f25101a07e385f2040a4aea"} Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.946389 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.946401 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rsxq7" 
event={"ID":"8cc7e2b2-ad6c-44f4-b477-951936b867d8","Type":"ContainerStarted","Data":"aa223d90315ba91389e22c6f9c91312b6fbc165cbabd8a084f30dc4bef0d1865"} Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.946852 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rsxq7" Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.971266 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qfs56" podStartSLOduration=3.750022447 podStartE2EDuration="8.971246313s" podCreationTimestamp="2025-12-06 09:22:04 +0000 UTC" firstStartedPulling="2025-12-06 09:22:07.15773709 +0000 UTC m=+944.901997377" lastFinishedPulling="2025-12-06 09:22:12.378960956 +0000 UTC m=+950.123221243" observedRunningTime="2025-12-06 09:22:12.959872224 +0000 UTC m=+950.704132521" watchObservedRunningTime="2025-12-06 09:22:12.971246313 +0000 UTC m=+950.715506600" Dec 06 09:22:12 crc kubenswrapper[4672]: I1206 09:22:12.988880 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rsxq7" podStartSLOduration=23.597793703 podStartE2EDuration="27.988856371s" podCreationTimestamp="2025-12-06 09:21:45 +0000 UTC" firstStartedPulling="2025-12-06 09:22:06.51494608 +0000 UTC m=+944.259206367" lastFinishedPulling="2025-12-06 09:22:10.906008748 +0000 UTC m=+948.650269035" observedRunningTime="2025-12-06 09:22:12.988368568 +0000 UTC m=+950.732628855" watchObservedRunningTime="2025-12-06 09:22:12.988856371 +0000 UTC m=+950.733116688" Dec 06 09:22:13 crc kubenswrapper[4672]: I1206 09:22:13.936350 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-brwrz"] Dec 06 09:22:13 crc kubenswrapper[4672]: I1206 09:22:13.937798 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:13 crc kubenswrapper[4672]: I1206 09:22:13.959220 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwrz"] Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.086504 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-utilities\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.086716 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z949x\" (UniqueName: \"kubernetes.io/projected/47170544-affc-42cd-8c00-305d44f1efa0-kube-api-access-z949x\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.086775 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-catalog-content\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.188046 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-utilities\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.188141 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z949x\" (UniqueName: \"kubernetes.io/projected/47170544-affc-42cd-8c00-305d44f1efa0-kube-api-access-z949x\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.188195 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-catalog-content\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.188760 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-catalog-content\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.189099 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-utilities\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.231445 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z949x\" (UniqueName: \"kubernetes.io/projected/47170544-affc-42cd-8c00-305d44f1efa0-kube-api-access-z949x\") pod \"redhat-marketplace-brwrz\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.289552 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.797038 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.797388 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:14 crc kubenswrapper[4672]: I1206 09:22:14.845045 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:20 crc kubenswrapper[4672]: I1206 09:22:20.119182 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwrz"] Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.043769 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8c53efb2-1642-4efd-b920-7ad41e6c136a","Type":"ContainerStarted","Data":"e47592a7a82be5cc2aa8991f026901bddc46feae84e5d9cb3c35bdde2b13fca3"} Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.045829 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"37d0f081-e2da-4845-9097-31607c42efc4","Type":"ContainerStarted","Data":"dd23292204d37dca6b9db0ac3cdbb030dabd2e717b5519b41a0ff93b8dcf25ba"} Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.047533 4672 generic.go:334] "Generic (PLEG): container finished" podID="47170544-affc-42cd-8c00-305d44f1efa0" containerID="25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350" exitCode=0 Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.047572 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwrz" event={"ID":"47170544-affc-42cd-8c00-305d44f1efa0","Type":"ContainerDied","Data":"25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350"} Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.047870 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwrz" event={"ID":"47170544-affc-42cd-8c00-305d44f1efa0","Type":"ContainerStarted","Data":"c59dfe7edc5ddbf7a57ce3f625e676ce7ba60581160c810b583f1220476f337c"} Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.049694 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"944c7316-15fa-4e57-896c-65205e8137b2","Type":"ContainerStarted","Data":"f58790bf4811c91d4adac6cd3193098c8f489411321867273727061b9b959dc3"} Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.049884 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 06 09:22:21 crc kubenswrapper[4672]: I1206 09:22:21.140268 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.132722524 podStartE2EDuration="43.140243355s" podCreationTimestamp="2025-12-06 09:21:38 +0000 UTC" firstStartedPulling="2025-12-06 09:21:47.28691429 +0000 UTC m=+925.031174577" 
lastFinishedPulling="2025-12-06 09:22:20.294435121 +0000 UTC m=+958.038695408" observedRunningTime="2025-12-06 09:22:21.136682128 +0000 UTC m=+958.880942425" watchObservedRunningTime="2025-12-06 09:22:21.140243355 +0000 UTC m=+958.884503642" Dec 06 09:22:22 crc kubenswrapper[4672]: I1206 09:22:22.042448 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.077646 4672 generic.go:334] "Generic (PLEG): container finished" podID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerID="b4112b1434bc38aa5c40c9459aed5c9a4e91f30cba5a6120d389ae3e2d49496e" exitCode=0 Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.077895 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" event={"ID":"b19d3022-686a-4cad-9a8f-cb89e48efeca","Type":"ContainerDied","Data":"b4112b1434bc38aa5c40c9459aed5c9a4e91f30cba5a6120d389ae3e2d49496e"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.082000 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1bbe623e-19ec-49f2-bfa4-65728b94d035","Type":"ContainerStarted","Data":"0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.085666 4672 generic.go:334] "Generic (PLEG): container finished" podID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerID="a801442f2357b535f1691db2c114a19b91d50dcf75ceff99378f49ceb906ee54" exitCode=0 Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.085742 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-d99dl" event={"ID":"e6d719fd-72b2-4fe2-a634-b92e6b6f3902","Type":"ContainerDied","Data":"a801442f2357b535f1691db2c114a19b91d50dcf75ceff99378f49ceb906ee54"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.089025 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"54ae723f-36b7-4991-9439-23af064249fa","Type":"ContainerStarted","Data":"89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.092645 4672 generic.go:334] "Generic (PLEG): container finished" podID="47170544-affc-42cd-8c00-305d44f1efa0" containerID="2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01" exitCode=0 Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.092737 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwrz" event={"ID":"47170544-affc-42cd-8c00-305d44f1efa0","Type":"ContainerDied","Data":"2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.106768 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4cddfb03-e3ff-478e-91c7-e3b58145d1e6","Type":"ContainerStarted","Data":"98daa7cde1744b0ff0a4b6bd84bb3ec9908ca76d96eb4c8c974f2a215acb4d81"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.157356 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9f085ca1-832b-40dc-b131-2c287df92f6e","Type":"ContainerStarted","Data":"2af6e02a3ea1513b875b316a9ca8fc9f674f47b14c1099ef98e92fa8633c0396"} Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.291923 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=22.304552631 
podStartE2EDuration="36.291906817s" podCreationTimestamp="2025-12-06 09:21:47 +0000 UTC" firstStartedPulling="2025-12-06 09:22:06.278591721 +0000 UTC m=+944.022851998" lastFinishedPulling="2025-12-06 09:22:20.265945897 +0000 UTC m=+958.010206184" observedRunningTime="2025-12-06 09:22:23.289241565 +0000 UTC m=+961.033501862" watchObservedRunningTime="2025-12-06 09:22:23.291906817 +0000 UTC m=+961.036167104" Dec 06 09:22:23 crc kubenswrapper[4672]: I1206 09:22:23.307390 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=26.660980364 podStartE2EDuration="39.307372328s" podCreationTimestamp="2025-12-06 09:21:44 +0000 UTC" firstStartedPulling="2025-12-06 09:22:07.513959225 +0000 UTC m=+945.258219512" lastFinishedPulling="2025-12-06 09:22:20.160351189 +0000 UTC m=+957.904611476" observedRunningTime="2025-12-06 09:22:23.305135037 +0000 UTC m=+961.049395324" watchObservedRunningTime="2025-12-06 09:22:23.307372328 +0000 UTC m=+961.051632615" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.170628 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwrz" event={"ID":"47170544-affc-42cd-8c00-305d44f1efa0","Type":"ContainerStarted","Data":"a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1"} Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.178365 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" event={"ID":"b19d3022-686a-4cad-9a8f-cb89e48efeca","Type":"ContainerStarted","Data":"1cebab99fe2c340a9519a9535cad695c5db362ecffef34682e0c99ebdfb8a57a"} Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.179066 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.184186 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-d99dl" event={"ID":"e6d719fd-72b2-4fe2-a634-b92e6b6f3902","Type":"ContainerStarted","Data":"b27c2feac5fc02a3244c6e180855a0d71cad2eda14fb34dabf2844458a3e4da8"} Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.202755 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-brwrz" podStartSLOduration=8.73947728 podStartE2EDuration="11.202730536s" podCreationTimestamp="2025-12-06 09:22:13 +0000 UTC" firstStartedPulling="2025-12-06 09:22:21.04986368 +0000 UTC m=+958.794123977" lastFinishedPulling="2025-12-06 09:22:23.513116956 +0000 UTC m=+961.257377233" observedRunningTime="2025-12-06 09:22:24.202543651 +0000 UTC m=+961.946803948" watchObservedRunningTime="2025-12-06 09:22:24.202730536 +0000 UTC m=+961.946990863" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.222001 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb666b895-d99dl" podStartSLOduration=5.127453936 podStartE2EDuration="50.221981589s" podCreationTimestamp="2025-12-06 09:21:34 +0000 UTC" firstStartedPulling="2025-12-06 09:21:35.310747089 +0000 UTC m=+913.055007376" lastFinishedPulling="2025-12-06 09:22:20.405274752 +0000 UTC m=+958.149535029" observedRunningTime="2025-12-06 09:22:24.221345962 +0000 UTC m=+961.965606269" watchObservedRunningTime="2025-12-06 09:22:24.221981589 +0000 UTC m=+961.966241886" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.252235 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" podStartSLOduration=-9223371985.602562 podStartE2EDuration="51.252214371s" podCreationTimestamp="2025-12-06 09:21:33 +0000 UTC" firstStartedPulling="2025-12-06 09:21:34.736835431 +0000 UTC m=+912.481095718" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:24.245982202 +0000 UTC m=+961.990242499" watchObservedRunningTime="2025-12-06 09:22:24.252214371 +0000 UTC m=+961.996474678" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.290338 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.290572 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.340863 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.637191 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb666b895-d99dl" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.858275 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:24 crc kubenswrapper[4672]: I1206 09:22:24.913578 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qfs56"] Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.192058 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qfs56" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="registry-server" containerID="cri-o://a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e" gracePeriod=2 Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.325360 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.341446 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.341798 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-brwrz" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="registry-server" probeResult="failure" output=< Dec 06 09:22:25 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:22:25 crc kubenswrapper[4672]: > Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.397652 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.398843 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.625866 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.649229 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-catalog-content\") pod \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.649316 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nv76\" (UniqueName: \"kubernetes.io/projected/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-kube-api-access-4nv76\") pod \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.649429 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-utilities\") pod \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\" (UID: \"453d0f39-ffba-4ad4-9a6f-07d7539c60ce\") " Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.652980 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-utilities" (OuterVolumeSpecName: "utilities") pod "453d0f39-ffba-4ad4-9a6f-07d7539c60ce" (UID: "453d0f39-ffba-4ad4-9a6f-07d7539c60ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.658854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-kube-api-access-4nv76" (OuterVolumeSpecName: "kube-api-access-4nv76") pod "453d0f39-ffba-4ad4-9a6f-07d7539c60ce" (UID: "453d0f39-ffba-4ad4-9a6f-07d7539c60ce"). InnerVolumeSpecName "kube-api-access-4nv76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.711472 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "453d0f39-ffba-4ad4-9a6f-07d7539c60ce" (UID: "453d0f39-ffba-4ad4-9a6f-07d7539c60ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.751356 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.751386 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nv76\" (UniqueName: \"kubernetes.io/projected/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-kube-api-access-4nv76\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:25 crc kubenswrapper[4672]: I1206 09:22:25.751398 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d0f39-ffba-4ad4-9a6f-07d7539c60ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.203695 4672 generic.go:334] "Generic (PLEG): container finished" podID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerID="a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e" exitCode=0 Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.203866 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfs56" event={"ID":"453d0f39-ffba-4ad4-9a6f-07d7539c60ce","Type":"ContainerDied","Data":"a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e"} Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.204315 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfs56" event={"ID":"453d0f39-ffba-4ad4-9a6f-07d7539c60ce","Type":"ContainerDied","Data":"747a24322d5a3674f4a9c8abe3b7564c3c5f25a5deaa9601cecbd6c83d66cce9"} Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.204349 4672 scope.go:117] "RemoveContainer" containerID="a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.203998 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qfs56" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.210832 4672 generic.go:334] "Generic (PLEG): container finished" podID="8c53efb2-1642-4efd-b920-7ad41e6c136a" containerID="e47592a7a82be5cc2aa8991f026901bddc46feae84e5d9cb3c35bdde2b13fca3" exitCode=0 Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.211240 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8c53efb2-1642-4efd-b920-7ad41e6c136a","Type":"ContainerDied","Data":"e47592a7a82be5cc2aa8991f026901bddc46feae84e5d9cb3c35bdde2b13fca3"} Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.218868 4672 generic.go:334] "Generic (PLEG): container finished" podID="37d0f081-e2da-4845-9097-31607c42efc4" containerID="dd23292204d37dca6b9db0ac3cdbb030dabd2e717b5519b41a0ff93b8dcf25ba" exitCode=0 Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.226078 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"37d0f081-e2da-4845-9097-31607c42efc4","Type":"ContainerDied","Data":"dd23292204d37dca6b9db0ac3cdbb030dabd2e717b5519b41a0ff93b8dcf25ba"} Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.226338 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.267060 4672 scope.go:117] "RemoveContainer" containerID="9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.319301 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qfs56"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.319913 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.320696 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.327475 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qfs56"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.489141 4672 scope.go:117] "RemoveContainer" containerID="197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.540768 4672 scope.go:117] "RemoveContainer" containerID="a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e" Dec 06 09:22:26 crc kubenswrapper[4672]: E1206 09:22:26.544725 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e\": container with ID starting with a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e not found: ID does not exist" containerID="a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.544778 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e"} err="failed to get container status \"a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e\": rpc error: code = NotFound desc = could not find container \"a65a75dc31fd384eb2db8cb44ac3ae8adb6f38dabdcfc90cb0a66be1e30c2e1e\": container with ID starting with 
Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.544808 4672 scope.go:117] "RemoveContainer" containerID="9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59" Dec 06 09:22:26 crc kubenswrapper[4672]: E1206 09:22:26.545117 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59\": container with ID starting with 9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59 not found: ID does not exist" containerID="9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.545145 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59"} err="failed to get container status \"9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59\": rpc error: code = NotFound desc = could not find container \"9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59\": container with ID starting with 9f327a064988bc0a87434c094d994b96a8f051643e08111c3bb197aa59066a59 not found: ID does not exist" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.545165 4672 scope.go:117] "RemoveContainer" containerID="197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7" Dec 06 09:22:26 crc kubenswrapper[4672]: E1206 09:22:26.545458 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7\": container with ID starting with 197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7 not found: ID does not exist" containerID="197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.545493 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7"} err="failed to get container status \"197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7\": rpc error: code = NotFound desc = could not find container \"197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7\": container with ID starting with 197007904e3f1ee185c409f3655d47a59ce027fddd2e564177e26ecf9d8925d7 not found: ID does not exist" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.575959 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" path="/var/lib/kubelet/pods/453d0f39-ffba-4ad4-9a6f-07d7539c60ce/volumes" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.649165 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-bkstt"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.649393 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerName="dnsmasq-dns" containerID="cri-o://1cebab99fe2c340a9519a9535cad695c5db362ecffef34682e0c99ebdfb8a57a" gracePeriod=10 Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.674538 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-d8wm4"] Dec 06 09:22:26 crc kubenswrapper[4672]: E1206
09:22:26.674889 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="extract-content" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.674904 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="extract-content" Dec 06 09:22:26 crc kubenswrapper[4672]: E1206 09:22:26.674918 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="extract-utilities" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.674925 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="extract-utilities" Dec 06 09:22:26 crc kubenswrapper[4672]: E1206 09:22:26.674953 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="registry-server" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.674960 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="registry-server" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.675147 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="453d0f39-ffba-4ad4-9a6f-07d7539c60ce" containerName="registry-server" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.676938 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.679379 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.710107 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-d8wm4"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.759924 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-q5ktw"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.760894 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.767260 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.777723 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-ovsdbserver-sb\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.777814 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtg6\" (UniqueName: \"kubernetes.io/projected/4baac82c-1001-4c74-9efe-b11e29efcce9-kube-api-access-xxtg6\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.777864 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-dns-svc\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.777883 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-config\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.785360 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q5ktw"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.879543 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.880789 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8b30f64-653c-49e8-857d-af30b3126e2d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.880850 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8b30f64-653c-49e8-857d-af30b3126e2d-ovs-rundir\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.880912 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtg6\" (UniqueName: \"kubernetes.io/projected/4baac82c-1001-4c74-9efe-b11e29efcce9-kube-api-access-xxtg6\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.880937 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b30f64-653c-49e8-857d-af30b3126e2d-config\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.880970 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b30f64-653c-49e8-857d-af30b3126e2d-combined-ca-bundle\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.881003 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-dns-svc\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.881023 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-config\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.881042 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv5w7\" (UniqueName: \"kubernetes.io/projected/a8b30f64-653c-49e8-857d-af30b3126e2d-kube-api-access-sv5w7\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.881086 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8b30f64-653c-49e8-857d-af30b3126e2d-ovn-rundir\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.881112 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-ovsdbserver-sb\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.882684 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-ovsdbserver-sb\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.882820 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.883861 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.883959 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-dns-svc\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.884514 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-config\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.889245 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.889508 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.889671 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kfbpd" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.889846 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.913797 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-d99dl"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.914105 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb666b895-d99dl" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerName="dnsmasq-dns" containerID="cri-o://b27c2feac5fc02a3244c6e180855a0d71cad2eda14fb34dabf2844458a3e4da8" gracePeriod=10 Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.933218 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtg6\" (UniqueName: \"kubernetes.io/projected/4baac82c-1001-4c74-9efe-b11e29efcce9-kube-api-access-xxtg6\") pod \"dnsmasq-dns-846f75bbfc-d8wm4\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.975441 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-mc6xv"] Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.976665 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.982889 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985306 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-config\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985356 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxgrc\" (UniqueName: \"kubernetes.io/projected/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-kube-api-access-dxgrc\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985405 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-scripts\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985437 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b30f64-653c-49e8-857d-af30b3126e2d-config\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985484 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b30f64-653c-49e8-857d-af30b3126e2d-combined-ca-bundle\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985534 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv5w7\" (UniqueName: \"kubernetes.io/projected/a8b30f64-653c-49e8-857d-af30b3126e2d-kube-api-access-sv5w7\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985571 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985609 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8b30f64-653c-49e8-857d-af30b3126e2d-ovn-rundir\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985633 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-ovn-rundir\") pod 
\"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985664 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8b30f64-653c-49e8-857d-af30b3126e2d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985683 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985716 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8b30f64-653c-49e8-857d-af30b3126e2d-ovs-rundir\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.985736 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.986418 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8b30f64-653c-49e8-857d-af30b3126e2d-ovn-rundir\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.986712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8b30f64-653c-49e8-857d-af30b3126e2d-ovs-rundir\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.986774 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8b30f64-653c-49e8-857d-af30b3126e2d-config\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.991542 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8b30f64-653c-49e8-857d-af30b3126e2d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.997023 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b30f64-653c-49e8-857d-af30b3126e2d-combined-ca-bundle\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 
09:22:26 crc kubenswrapper[4672]: I1206 09:22:26.997175 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-mc6xv"] Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.004295 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.041684 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv5w7\" (UniqueName: \"kubernetes.io/projected/a8b30f64-653c-49e8-857d-af30b3126e2d-kube-api-access-sv5w7\") pod \"ovn-controller-metrics-q5ktw\" (UID: \"a8b30f64-653c-49e8-857d-af30b3126e2d\") " pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:27 crc kubenswrapper[4672]: E1206 09:22:27.086254 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d719fd_72b2_4fe2_a634_b92e6b6f3902.slice/crio-b27c2feac5fc02a3244c6e180855a0d71cad2eda14fb34dabf2844458a3e4da8.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086561 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086630 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086681 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-dns-svc\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086705 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghp8\" (UniqueName: \"kubernetes.io/projected/00fa21e3-1eba-4ac6-9eb7-330297e229fb-kube-api-access-mghp8\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086727 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086757 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086777 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086801 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-config\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-config\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086835 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxgrc\" (UniqueName: \"kubernetes.io/projected/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-kube-api-access-dxgrc\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.086872 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-scripts\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.087663 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-scripts\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.094055 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-config\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.094816 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.095225 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.095435 
4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-q5ktw" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.100413 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.101001 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.119492 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxgrc\" (UniqueName: \"kubernetes.io/projected/a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb-kube-api-access-dxgrc\") pod \"ovn-northd-0\" (UID: \"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb\") " pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.187987 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.188321 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-dns-svc\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.188352 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghp8\" (UniqueName: \"kubernetes.io/projected/00fa21e3-1eba-4ac6-9eb7-330297e229fb-kube-api-access-mghp8\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.188388 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.188414 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-config\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.190451 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 
09:22:27.190672 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-config\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.191012 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-dns-svc\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.193344 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.225236 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.226131 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghp8\" (UniqueName: \"kubernetes.io/projected/00fa21e3-1eba-4ac6-9eb7-330297e229fb-kube-api-access-mghp8\") pod \"dnsmasq-dns-984c76dd7-mc6xv\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") " pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.288890 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8c53efb2-1642-4efd-b920-7ad41e6c136a","Type":"ContainerStarted","Data":"698499976e4497d280f6a1144f07d53a0f848a4a6d7c892d3fa7e92a2024a53c"} Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.349531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"37d0f081-e2da-4845-9097-31607c42efc4","Type":"ContainerStarted","Data":"1a77097ada88613acab052f3b7057539a7e1517a3e3ca794e4c07509aa498246"} Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.358747 4672 generic.go:334] "Generic (PLEG): container finished" podID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerID="1cebab99fe2c340a9519a9535cad695c5db362ecffef34682e0c99ebdfb8a57a" exitCode=0 Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.358982 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" event={"ID":"b19d3022-686a-4cad-9a8f-cb89e48efeca","Type":"ContainerDied","Data":"1cebab99fe2c340a9519a9535cad695c5db362ecffef34682e0c99ebdfb8a57a"} Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.360380 4672 generic.go:334] "Generic (PLEG): container finished" podID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerID="b27c2feac5fc02a3244c6e180855a0d71cad2eda14fb34dabf2844458a3e4da8" exitCode=0 Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.361141 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-d99dl" event={"ID":"e6d719fd-72b2-4fe2-a634-b92e6b6f3902","Type":"ContainerDied","Data":"b27c2feac5fc02a3244c6e180855a0d71cad2eda14fb34dabf2844458a3e4da8"} Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.376680 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.428507 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.857025433 podStartE2EDuration="50.428488313s" podCreationTimestamp="2025-12-06 09:21:37 +0000 UTC" firstStartedPulling="2025-12-06 09:21:39.435425072 +0000 UTC m=+917.179685359" lastFinishedPulling="2025-12-06 09:22:20.006887952 +0000 UTC m=+957.751148239" observedRunningTime="2025-12-06 09:22:27.323866412 +0000 UTC m=+965.068126699" watchObservedRunningTime="2025-12-06 09:22:27.428488313 +0000 UTC m=+965.172748600" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.445352 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.334425477 podStartE2EDuration="52.445336721s" podCreationTimestamp="2025-12-06 09:21:35 +0000 UTC" firstStartedPulling="2025-12-06 09:21:37.897103448 +0000 UTC m=+915.641363735" lastFinishedPulling="2025-12-06 09:22:20.008014692 +0000 UTC m=+957.752274979" observedRunningTime="2025-12-06 09:22:27.426992692 +0000 UTC m=+965.171252979" watchObservedRunningTime="2025-12-06 09:22:27.445336721 +0000 UTC m=+965.189597008" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.642728 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-d99dl" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.739896 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-dns-svc\") pod \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.740480 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-config\") pod \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.740545 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx4l6\" (UniqueName: \"kubernetes.io/projected/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-kube-api-access-nx4l6\") pod \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\" (UID: \"e6d719fd-72b2-4fe2-a634-b92e6b6f3902\") " Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.745481 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q5ktw"] Dec 06 09:22:27 crc kubenswrapper[4672]: W1206 09:22:27.759770 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4baac82c_1001_4c74_9efe_b11e29efcce9.slice/crio-33e67329fc1b566aab8a4f18f9a06ab5c8838cd808eab67d74b2da088bb4002f WatchSource:0}: Error finding container 33e67329fc1b566aab8a4f18f9a06ab5c8838cd808eab67d74b2da088bb4002f: Status 404 returned error can't find the container with id 33e67329fc1b566aab8a4f18f9a06ab5c8838cd808eab67d74b2da088bb4002f Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.765987 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.769497 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-d8wm4"] Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.771630 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-kube-api-access-nx4l6" (OuterVolumeSpecName: "kube-api-access-nx4l6") pod "e6d719fd-72b2-4fe2-a634-b92e6b6f3902" (UID: "e6d719fd-72b2-4fe2-a634-b92e6b6f3902"). InnerVolumeSpecName "kube-api-access-nx4l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.819921 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-config" (OuterVolumeSpecName: "config") pod "e6d719fd-72b2-4fe2-a634-b92e6b6f3902" (UID: "e6d719fd-72b2-4fe2-a634-b92e6b6f3902"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.839167 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6d719fd-72b2-4fe2-a634-b92e6b6f3902" (UID: "e6d719fd-72b2-4fe2-a634-b92e6b6f3902"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.855335 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-config\") pod \"b19d3022-686a-4cad-9a8f-cb89e48efeca\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.855539 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrkfs\" (UniqueName: \"kubernetes.io/projected/b19d3022-686a-4cad-9a8f-cb89e48efeca-kube-api-access-vrkfs\") pod \"b19d3022-686a-4cad-9a8f-cb89e48efeca\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.855786 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-dns-svc\") pod \"b19d3022-686a-4cad-9a8f-cb89e48efeca\" (UID: \"b19d3022-686a-4cad-9a8f-cb89e48efeca\") " Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.856744 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.856767 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.856782 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx4l6\" (UniqueName: \"kubernetes.io/projected/e6d719fd-72b2-4fe2-a634-b92e6b6f3902-kube-api-access-nx4l6\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.868836 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b19d3022-686a-4cad-9a8f-cb89e48efeca-kube-api-access-vrkfs" (OuterVolumeSpecName: "kube-api-access-vrkfs") pod "b19d3022-686a-4cad-9a8f-cb89e48efeca" (UID: "b19d3022-686a-4cad-9a8f-cb89e48efeca"). InnerVolumeSpecName "kube-api-access-vrkfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.911309 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-config" (OuterVolumeSpecName: "config") pod "b19d3022-686a-4cad-9a8f-cb89e48efeca" (UID: "b19d3022-686a-4cad-9a8f-cb89e48efeca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.917158 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 09:22:27 crc kubenswrapper[4672]: W1206 09:22:27.938628 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f49a03_3f9d_46c1_86a8_9ad0a7e6c7fb.slice/crio-035ad5dbded8c9f9b83a981489395006cbb4901e9830f3ff9a8e1cb6de763079 WatchSource:0}: Error finding container 035ad5dbded8c9f9b83a981489395006cbb4901e9830f3ff9a8e1cb6de763079: Status 404 returned error can't find the container with id 035ad5dbded8c9f9b83a981489395006cbb4901e9830f3ff9a8e1cb6de763079 Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.944158 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b19d3022-686a-4cad-9a8f-cb89e48efeca" (UID: "b19d3022-686a-4cad-9a8f-cb89e48efeca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.958180 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.958207 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19d3022-686a-4cad-9a8f-cb89e48efeca-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:27 crc kubenswrapper[4672]: I1206 09:22:27.958217 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrkfs\" (UniqueName: \"kubernetes.io/projected/b19d3022-686a-4cad-9a8f-cb89e48efeca-kube-api-access-vrkfs\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.063451 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-mc6xv"] Dec 06 09:22:28 crc kubenswrapper[4672]: W1206 09:22:28.070747 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00fa21e3_1eba_4ac6_9eb7_330297e229fb.slice/crio-867943f01e7f1ec4d8475ac4ebb9b900728b58b74d7fdacd72af0d917ee6f4c6 WatchSource:0}: Error finding container 867943f01e7f1ec4d8475ac4ebb9b900728b58b74d7fdacd72af0d917ee6f4c6: Status 404 returned error can't find the container with id 867943f01e7f1ec4d8475ac4ebb9b900728b58b74d7fdacd72af0d917ee6f4c6 Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.383553 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" event={"ID":"b19d3022-686a-4cad-9a8f-cb89e48efeca","Type":"ContainerDied","Data":"5d6ba808750fe7f149611ecd513a302ea6d2b69cfa685a94db0c7fe6faaede50"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.384005 4672 scope.go:117] "RemoveContainer" containerID="1cebab99fe2c340a9519a9535cad695c5db362ecffef34682e0c99ebdfb8a57a" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.383956 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-bkstt" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.403822 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-d99dl" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.404274 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-d99dl" event={"ID":"e6d719fd-72b2-4fe2-a634-b92e6b6f3902","Type":"ContainerDied","Data":"f85cde69bc2fd1dacd29fb76c8948abb24cfdc36c07e2595714cb22cae8ee3ac"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.407396 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb","Type":"ContainerStarted","Data":"035ad5dbded8c9f9b83a981489395006cbb4901e9830f3ff9a8e1cb6de763079"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.420536 4672 generic.go:334] "Generic (PLEG): container finished" podID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerID="90ebba68239c1a06270308e81ae0d15fa97058c9cff698e26c6c9dd75e2904a5" exitCode=0 Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.421410 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" event={"ID":"00fa21e3-1eba-4ac6-9eb7-330297e229fb","Type":"ContainerDied","Data":"90ebba68239c1a06270308e81ae0d15fa97058c9cff698e26c6c9dd75e2904a5"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.421436 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" event={"ID":"00fa21e3-1eba-4ac6-9eb7-330297e229fb","Type":"ContainerStarted","Data":"867943f01e7f1ec4d8475ac4ebb9b900728b58b74d7fdacd72af0d917ee6f4c6"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.433735 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q5ktw" event={"ID":"a8b30f64-653c-49e8-857d-af30b3126e2d","Type":"ContainerStarted","Data":"d397b36197be2365f7d8230022111833d4eb9e413217e1f11eb314bc4dc8dbde"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.433808 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q5ktw" event={"ID":"a8b30f64-653c-49e8-857d-af30b3126e2d","Type":"ContainerStarted","Data":"cf04642ae96742d0303fe266c1bdcc2a3869437ba7ec366427ee040974348a5b"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.437254 4672 generic.go:334] "Generic (PLEG): container finished" podID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerID="6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7" exitCode=0 Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.437468 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" event={"ID":"4baac82c-1001-4c74-9efe-b11e29efcce9","Type":"ContainerDied","Data":"6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.437515 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" event={"ID":"4baac82c-1001-4c74-9efe-b11e29efcce9","Type":"ContainerStarted","Data":"33e67329fc1b566aab8a4f18f9a06ab5c8838cd808eab67d74b2da088bb4002f"} Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.508077 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.508124 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.520179 4672 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-metrics-q5ktw" podStartSLOduration=2.520143295 podStartE2EDuration="2.520143295s" podCreationTimestamp="2025-12-06 09:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:28.504291114 +0000 UTC m=+966.248551411" watchObservedRunningTime="2025-12-06 09:22:28.520143295 +0000 UTC m=+966.264403572" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.578358 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-d99dl"] Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.582851 4672 scope.go:117] "RemoveContainer" containerID="b4112b1434bc38aa5c40c9459aed5c9a4e91f30cba5a6120d389ae3e2d49496e" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.597740 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-d99dl"] Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.603969 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-bkstt"] Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.618709 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-bkstt"] Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.666203 4672 scope.go:117] "RemoveContainer" containerID="b27c2feac5fc02a3244c6e180855a0d71cad2eda14fb34dabf2844458a3e4da8" Dec 06 09:22:28 crc kubenswrapper[4672]: I1206 09:22:28.699326 4672 scope.go:117] "RemoveContainer" containerID="a801442f2357b535f1691db2c114a19b91d50dcf75ceff99378f49ceb906ee54" Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.071387 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.099068 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.445544 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" event={"ID":"4baac82c-1001-4c74-9efe-b11e29efcce9","Type":"ContainerStarted","Data":"4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0"} Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.445750 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.453491 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" event={"ID":"00fa21e3-1eba-4ac6-9eb7-330297e229fb","Type":"ContainerStarted","Data":"069624d6f78c879722c48df2461e59ab428ff9cd50de3dd13265af4422a0f040"} Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.453662 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="944c7316-15fa-4e57-896c-65205e8137b2" containerName="memcached" containerID="cri-o://f58790bf4811c91d4adac6cd3193098c8f489411321867273727061b9b959dc3" gracePeriod=30 Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.491111 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" podStartSLOduration=3.491092686 podStartE2EDuration="3.491092686s" podCreationTimestamp="2025-12-06 09:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:29.465584464 +0000 UTC 
m=+967.209844751" watchObservedRunningTime="2025-12-06 09:22:29.491092686 +0000 UTC m=+967.235352973" Dec 06 09:22:29 crc kubenswrapper[4672]: I1206 09:22:29.494945 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" podStartSLOduration=3.494932161 podStartE2EDuration="3.494932161s" podCreationTimestamp="2025-12-06 09:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:29.489559975 +0000 UTC m=+967.233820262" watchObservedRunningTime="2025-12-06 09:22:29.494932161 +0000 UTC m=+967.239192448" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.461479 4672 generic.go:334] "Generic (PLEG): container finished" podID="944c7316-15fa-4e57-896c-65205e8137b2" containerID="f58790bf4811c91d4adac6cd3193098c8f489411321867273727061b9b959dc3" exitCode=0 Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.461524 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"944c7316-15fa-4e57-896c-65205e8137b2","Type":"ContainerDied","Data":"f58790bf4811c91d4adac6cd3193098c8f489411321867273727061b9b959dc3"} Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.463494 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb","Type":"ContainerStarted","Data":"ecdeb1f24c40b3836d785ef64bac266f9f8c8b4a67d6a8cf1ffb2d620d8b2af2"} Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.463528 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb","Type":"ContainerStarted","Data":"a3a5cdcfab6f5edc5b566996de261fe00283dc84f1355f3824cf61a190bb6daa"} Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.464831 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.490449 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.109673136 podStartE2EDuration="4.49042968s" podCreationTimestamp="2025-12-06 09:22:26 +0000 UTC" firstStartedPulling="2025-12-06 09:22:27.948320722 +0000 UTC m=+965.692581009" lastFinishedPulling="2025-12-06 09:22:29.329077266 +0000 UTC m=+967.073337553" observedRunningTime="2025-12-06 09:22:30.48823183 +0000 UTC m=+968.232492117" watchObservedRunningTime="2025-12-06 09:22:30.49042968 +0000 UTC m=+968.234689967" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.569820 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" path="/var/lib/kubelet/pods/b19d3022-686a-4cad-9a8f-cb89e48efeca/volumes" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.570636 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" path="/var/lib/kubelet/pods/e6d719fd-72b2-4fe2-a634-b92e6b6f3902/volumes" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.720232 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.820748 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-combined-ca-bundle\") pod \"944c7316-15fa-4e57-896c-65205e8137b2\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.820805 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-memcached-tls-certs\") pod \"944c7316-15fa-4e57-896c-65205e8137b2\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.820852 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-kolla-config\") pod \"944c7316-15fa-4e57-896c-65205e8137b2\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.820934 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-config-data\") pod \"944c7316-15fa-4e57-896c-65205e8137b2\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.821057 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmj27\" (UniqueName: \"kubernetes.io/projected/944c7316-15fa-4e57-896c-65205e8137b2-kube-api-access-rmj27\") pod \"944c7316-15fa-4e57-896c-65205e8137b2\" (UID: \"944c7316-15fa-4e57-896c-65205e8137b2\") " Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.821460 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "944c7316-15fa-4e57-896c-65205e8137b2" (UID: "944c7316-15fa-4e57-896c-65205e8137b2"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.821756 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-config-data" (OuterVolumeSpecName: "config-data") pod "944c7316-15fa-4e57-896c-65205e8137b2" (UID: "944c7316-15fa-4e57-896c-65205e8137b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.825678 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944c7316-15fa-4e57-896c-65205e8137b2-kube-api-access-rmj27" (OuterVolumeSpecName: "kube-api-access-rmj27") pod "944c7316-15fa-4e57-896c-65205e8137b2" (UID: "944c7316-15fa-4e57-896c-65205e8137b2"). InnerVolumeSpecName "kube-api-access-rmj27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.843025 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "944c7316-15fa-4e57-896c-65205e8137b2" (UID: "944c7316-15fa-4e57-896c-65205e8137b2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.865558 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "944c7316-15fa-4e57-896c-65205e8137b2" (UID: "944c7316-15fa-4e57-896c-65205e8137b2"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.922366 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmj27\" (UniqueName: \"kubernetes.io/projected/944c7316-15fa-4e57-896c-65205e8137b2-kube-api-access-rmj27\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.922400 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.922409 4672 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c7316-15fa-4e57-896c-65205e8137b2-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.922418 4672 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:30 crc kubenswrapper[4672]: I1206 09:22:30.922427 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/944c7316-15fa-4e57-896c-65205e8137b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.471833 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"944c7316-15fa-4e57-896c-65205e8137b2","Type":"ContainerDied","Data":"87ee70e07fd1d22559b58b1e86accac12cc0448488943f601306eaaea4f24921"} Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.471892 4672 scope.go:117] "RemoveContainer" containerID="f58790bf4811c91d4adac6cd3193098c8f489411321867273727061b9b959dc3" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.473947 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.473979 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.539285 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.557114 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.566650 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 06 09:22:31 crc kubenswrapper[4672]: E1206 09:22:31.567105 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerName="dnsmasq-dns" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567126 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerName="dnsmasq-dns" Dec 06 09:22:31 crc kubenswrapper[4672]: E1206 09:22:31.567145 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerName="dnsmasq-dns" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567153 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerName="dnsmasq-dns" Dec 06 09:22:31 crc kubenswrapper[4672]: E1206 09:22:31.567170 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerName="init" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567178 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerName="init" Dec 06 09:22:31 crc kubenswrapper[4672]: E1206 09:22:31.567200 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944c7316-15fa-4e57-896c-65205e8137b2" containerName="memcached" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567208 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="944c7316-15fa-4e57-896c-65205e8137b2" containerName="memcached" Dec 06 09:22:31 crc kubenswrapper[4672]: E1206 09:22:31.567217 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerName="init" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567224 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerName="init" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567388 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19d3022-686a-4cad-9a8f-cb89e48efeca" containerName="dnsmasq-dns" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567418 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="944c7316-15fa-4e57-896c-65205e8137b2" containerName="memcached" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.567432 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d719fd-72b2-4fe2-a634-b92e6b6f3902" containerName="dnsmasq-dns" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.568165 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.569675 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.578577 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mb2mf" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.579271 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.582819 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.641808 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7595f929-2c12-4a7f-ba33-2701f7a701ee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.641868 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7595f929-2c12-4a7f-ba33-2701f7a701ee-kolla-config\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.641889 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7jp\" (UniqueName: \"kubernetes.io/projected/7595f929-2c12-4a7f-ba33-2701f7a701ee-kube-api-access-7v7jp\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.641909 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595f929-2c12-4a7f-ba33-2701f7a701ee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.641985 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7595f929-2c12-4a7f-ba33-2701f7a701ee-config-data\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.743229 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7595f929-2c12-4a7f-ba33-2701f7a701ee-config-data\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.743327 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7595f929-2c12-4a7f-ba33-2701f7a701ee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.743356 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/7595f929-2c12-4a7f-ba33-2701f7a701ee-kolla-config\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.743377 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7jp\" (UniqueName: \"kubernetes.io/projected/7595f929-2c12-4a7f-ba33-2701f7a701ee-kube-api-access-7v7jp\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.743397 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595f929-2c12-4a7f-ba33-2701f7a701ee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.744321 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7595f929-2c12-4a7f-ba33-2701f7a701ee-kolla-config\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.744916 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7595f929-2c12-4a7f-ba33-2701f7a701ee-config-data\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.746977 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595f929-2c12-4a7f-ba33-2701f7a701ee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.751391 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7595f929-2c12-4a7f-ba33-2701f7a701ee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.767277 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7jp\" (UniqueName: \"kubernetes.io/projected/7595f929-2c12-4a7f-ba33-2701f7a701ee-kube-api-access-7v7jp\") pod \"memcached-0\" (UID: \"7595f929-2c12-4a7f-ba33-2701f7a701ee\") " pod="openstack/memcached-0" Dec 06 09:22:31 crc kubenswrapper[4672]: I1206 09:22:31.897561 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.195730 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 09:22:32 crc kubenswrapper[4672]: W1206 09:22:32.207646 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7595f929_2c12_4a7f_ba33_2701f7a701ee.slice/crio-8e9d684232b25814d1166c12d63d21bc219db429c5014f1c3eaa6567067ecbb1 WatchSource:0}: Error finding container 8e9d684232b25814d1166c12d63d21bc219db429c5014f1c3eaa6567067ecbb1: Status 404 returned error can't find the container with id 8e9d684232b25814d1166c12d63d21bc219db429c5014f1c3eaa6567067ecbb1 Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.479852 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7595f929-2c12-4a7f-ba33-2701f7a701ee","Type":"ContainerStarted","Data":"37ab292059b3fe8c1ffe692e18be9e9d896657188104438ae973625d19ead2ac"} Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.479902 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7595f929-2c12-4a7f-ba33-2701f7a701ee","Type":"ContainerStarted","Data":"8e9d684232b25814d1166c12d63d21bc219db429c5014f1c3eaa6567067ecbb1"} Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.479999 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.499556 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.49954076 podStartE2EDuration="1.49954076s" podCreationTimestamp="2025-12-06 09:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:32.494253107 +0000 UTC m=+970.238513394" watchObservedRunningTime="2025-12-06 09:22:32.49954076 +0000 UTC m=+970.243801047" Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.565343 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944c7316-15fa-4e57-896c-65205e8137b2" path="/var/lib/kubelet/pods/944c7316-15fa-4e57-896c-65205e8137b2/volumes" Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.640006 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 06 09:22:32 crc kubenswrapper[4672]: I1206 09:22:32.755242 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 06 09:22:34 crc kubenswrapper[4672]: I1206 09:22:34.355767 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:34 crc kubenswrapper[4672]: I1206 09:22:34.408745 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:34 crc kubenswrapper[4672]: I1206 09:22:34.609086 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwrz"] Dec 06 09:22:35 crc kubenswrapper[4672]: I1206 09:22:35.502195 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-brwrz" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="registry-server" 
containerID="cri-o://a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1" gracePeriod=2 Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.004654 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.131239 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-catalog-content\") pod \"47170544-affc-42cd-8c00-305d44f1efa0\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.131314 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-utilities\") pod \"47170544-affc-42cd-8c00-305d44f1efa0\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.131411 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z949x\" (UniqueName: \"kubernetes.io/projected/47170544-affc-42cd-8c00-305d44f1efa0-kube-api-access-z949x\") pod \"47170544-affc-42cd-8c00-305d44f1efa0\" (UID: \"47170544-affc-42cd-8c00-305d44f1efa0\") " Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.132197 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-utilities" (OuterVolumeSpecName: "utilities") pod "47170544-affc-42cd-8c00-305d44f1efa0" (UID: "47170544-affc-42cd-8c00-305d44f1efa0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.146919 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47170544-affc-42cd-8c00-305d44f1efa0-kube-api-access-z949x" (OuterVolumeSpecName: "kube-api-access-z949x") pod "47170544-affc-42cd-8c00-305d44f1efa0" (UID: "47170544-affc-42cd-8c00-305d44f1efa0"). InnerVolumeSpecName "kube-api-access-z949x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.152998 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47170544-affc-42cd-8c00-305d44f1efa0" (UID: "47170544-affc-42cd-8c00-305d44f1efa0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.232936 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.232976 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47170544-affc-42cd-8c00-305d44f1efa0-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.232985 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z949x\" (UniqueName: \"kubernetes.io/projected/47170544-affc-42cd-8c00-305d44f1efa0-kube-api-access-z949x\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.514699 4672 generic.go:334] "Generic (PLEG): container finished" podID="47170544-affc-42cd-8c00-305d44f1efa0" containerID="a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1" exitCode=0 Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.514796 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwrz" event={"ID":"47170544-affc-42cd-8c00-305d44f1efa0","Type":"ContainerDied","Data":"a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1"} Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.514846 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwrz" event={"ID":"47170544-affc-42cd-8c00-305d44f1efa0","Type":"ContainerDied","Data":"c59dfe7edc5ddbf7a57ce3f625e676ce7ba60581160c810b583f1220476f337c"} Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.514875 4672 scope.go:117] "RemoveContainer" containerID="a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.514800 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwrz" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.545606 4672 scope.go:117] "RemoveContainer" containerID="2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.592809 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwrz"] Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.592854 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwrz"] Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.611354 4672 scope.go:117] "RemoveContainer" containerID="25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.658946 4672 scope.go:117] "RemoveContainer" containerID="a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1" Dec 06 09:22:36 crc kubenswrapper[4672]: E1206 09:22:36.659536 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1\": container with ID starting with a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1 not found: ID does not exist" containerID="a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.659648 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1"} err="failed to get container status \"a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1\": rpc error: code = NotFound desc = could not find container \"a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1\": container with ID starting with a1e5931959f3ab8a0d60da52c1a9d67b43c71d42af35017d4950a684a9cb49a1 not found: ID does not exist" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.659699 4672 scope.go:117] "RemoveContainer" containerID="2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01" Dec 06 09:22:36 crc kubenswrapper[4672]: E1206 09:22:36.660188 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01\": container with ID starting with 2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01 not found: ID does not exist" containerID="2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.660234 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01"} err="failed to get container status \"2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01\": rpc error: code = NotFound desc = could not find container \"2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01\": container with ID starting with 2334b7475a993d7b8d38ecdad9aa8101ef9c9e5ccb09a586afd81eea672a7b01 not found: ID does not exist" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.660272 4672 scope.go:117] "RemoveContainer" containerID="25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350" Dec 06 09:22:36 crc kubenswrapper[4672]: E1206 09:22:36.661824 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350\": container with ID starting with 25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350 not found: ID does not exist" containerID="25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350" Dec 06 09:22:36 crc kubenswrapper[4672]: I1206 09:22:36.661884 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350"} err="failed to get container status \"25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350\": rpc error: code = NotFound desc = could not find container \"25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350\": container with ID starting with 25a39d7b96cfcba3b0c17fa5cdbd12e149dd5ee92d51c0b60c4af04ecee5e350 not found: ID does not exist" Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.006876 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.067591 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.068424 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.247535 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.386694 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.442868 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-d8wm4"] Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.522308 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerName="dnsmasq-dns" containerID="cri-o://4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0" gracePeriod=10 Dec 06 09:22:37 crc kubenswrapper[4672]: I1206 09:22:37.622530 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.029731 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.211573 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxtg6\" (UniqueName: \"kubernetes.io/projected/4baac82c-1001-4c74-9efe-b11e29efcce9-kube-api-access-xxtg6\") pod \"4baac82c-1001-4c74-9efe-b11e29efcce9\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.211666 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-ovsdbserver-sb\") pod \"4baac82c-1001-4c74-9efe-b11e29efcce9\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.211736 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-config\") pod \"4baac82c-1001-4c74-9efe-b11e29efcce9\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.211931 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-dns-svc\") pod \"4baac82c-1001-4c74-9efe-b11e29efcce9\" (UID: \"4baac82c-1001-4c74-9efe-b11e29efcce9\") " Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.228020 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baac82c-1001-4c74-9efe-b11e29efcce9-kube-api-access-xxtg6" (OuterVolumeSpecName: "kube-api-access-xxtg6") pod "4baac82c-1001-4c74-9efe-b11e29efcce9" (UID: "4baac82c-1001-4c74-9efe-b11e29efcce9"). InnerVolumeSpecName "kube-api-access-xxtg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.249214 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4baac82c-1001-4c74-9efe-b11e29efcce9" (UID: "4baac82c-1001-4c74-9efe-b11e29efcce9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.256432 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4baac82c-1001-4c74-9efe-b11e29efcce9" (UID: "4baac82c-1001-4c74-9efe-b11e29efcce9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.259319 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-config" (OuterVolumeSpecName: "config") pod "4baac82c-1001-4c74-9efe-b11e29efcce9" (UID: "4baac82c-1001-4c74-9efe-b11e29efcce9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.313834 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxtg6\" (UniqueName: \"kubernetes.io/projected/4baac82c-1001-4c74-9efe-b11e29efcce9-kube-api-access-xxtg6\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.313866 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.313875 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.313883 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4baac82c-1001-4c74-9efe-b11e29efcce9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.534920 4672 generic.go:334] "Generic (PLEG): container finished" podID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerID="4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0" exitCode=0 Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.534987 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.534995 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" event={"ID":"4baac82c-1001-4c74-9efe-b11e29efcce9","Type":"ContainerDied","Data":"4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0"} Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.536502 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-846f75bbfc-d8wm4" event={"ID":"4baac82c-1001-4c74-9efe-b11e29efcce9","Type":"ContainerDied","Data":"33e67329fc1b566aab8a4f18f9a06ab5c8838cd808eab67d74b2da088bb4002f"} Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.536537 4672 scope.go:117] "RemoveContainer" containerID="4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.565847 4672 scope.go:117] "RemoveContainer" containerID="6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.576153 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47170544-affc-42cd-8c00-305d44f1efa0" path="/var/lib/kubelet/pods/47170544-affc-42cd-8c00-305d44f1efa0/volumes" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.595033 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-d8wm4"] Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.603380 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-846f75bbfc-d8wm4"] Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.610137 4672 scope.go:117] "RemoveContainer" containerID="4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0" Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.610819 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0\": container with ID starting with 4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0 not found: ID does not exist" containerID="4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.610874 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0"} err="failed to get container status \"4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0\": rpc error: code = NotFound desc = could not find container \"4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0\": container with ID starting with 4e5c866f6567cc77b3aef0fcbb5637fb7fbb12fe5357e18033f13cd32c52b3a0 not found: ID does not exist" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.610906 4672 scope.go:117] "RemoveContainer" containerID="6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7" Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.612008 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7\": container with ID starting with 6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7 not found: ID does not exist" containerID="6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.612053 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7"} err="failed to get container status \"6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7\": rpc error: code = NotFound desc = could not find container \"6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7\": container with ID starting with 6768281cbbebf65fd4c1bb0dc9893263d08aa12dc2cbe5db98e5d39a988c90c7 not found: ID does not exist" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.752688 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6jp7t"] Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.752994 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="registry-server" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753005 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="registry-server" Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.753017 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="extract-utilities" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753023 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="extract-utilities" Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.753048 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="extract-content" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753057 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="extract-content" Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.753068 4672 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerName="dnsmasq-dns" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753074 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerName="dnsmasq-dns" Dec 06 09:22:38 crc kubenswrapper[4672]: E1206 09:22:38.753083 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerName="init" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753089 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerName="init" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753229 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="47170544-affc-42cd-8c00-305d44f1efa0" containerName="registry-server" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753239 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" containerName="dnsmasq-dns" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.753736 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.776964 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6jp7t"] Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.886795 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b3bc-account-create-update-wf22b"] Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.888007 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.890190 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.893683 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b3bc-account-create-update-wf22b"] Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.929907 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438c535f-88ae-4e3d-98f9-014d67606706-operator-scripts\") pod \"keystone-db-create-6jp7t\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.929977 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hshxp\" (UniqueName: \"kubernetes.io/projected/438c535f-88ae-4e3d-98f9-014d67606706-kube-api-access-hshxp\") pod \"keystone-db-create-6jp7t\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.969187 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vnjc2"] Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.970205 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:38 crc kubenswrapper[4672]: I1206 09:22:38.980550 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vnjc2"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.031110 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438c535f-88ae-4e3d-98f9-014d67606706-operator-scripts\") pod \"keystone-db-create-6jp7t\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.031450 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzg8r\" (UniqueName: \"kubernetes.io/projected/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-kube-api-access-jzg8r\") pod \"keystone-b3bc-account-create-update-wf22b\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.031560 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hshxp\" (UniqueName: \"kubernetes.io/projected/438c535f-88ae-4e3d-98f9-014d67606706-kube-api-access-hshxp\") pod \"keystone-db-create-6jp7t\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.031734 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-operator-scripts\") pod \"keystone-b3bc-account-create-update-wf22b\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.031893 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438c535f-88ae-4e3d-98f9-014d67606706-operator-scripts\") pod \"keystone-db-create-6jp7t\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.060615 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hshxp\" (UniqueName: \"kubernetes.io/projected/438c535f-88ae-4e3d-98f9-014d67606706-kube-api-access-hshxp\") pod \"keystone-db-create-6jp7t\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.076317 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ce36-account-create-update-66fz7"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.077438 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.082558 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.085715 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ce36-account-create-update-66fz7"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.133009 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-operator-scripts\") pod \"keystone-b3bc-account-create-update-wf22b\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.133093 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzg8r\" (UniqueName: \"kubernetes.io/projected/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-kube-api-access-jzg8r\") pod \"keystone-b3bc-account-create-update-wf22b\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.133159 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-operator-scripts\") pod \"placement-db-create-vnjc2\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.133196 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtbx\" (UniqueName: \"kubernetes.io/projected/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-kube-api-access-lhtbx\") pod \"placement-db-create-vnjc2\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.133779 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-operator-scripts\") pod \"keystone-b3bc-account-create-update-wf22b\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.135327 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.148146 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzg8r\" (UniqueName: \"kubernetes.io/projected/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-kube-api-access-jzg8r\") pod \"keystone-b3bc-account-create-update-wf22b\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.201620 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.234755 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-operator-scripts\") pod \"placement-db-create-vnjc2\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.235003 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtbx\" (UniqueName: \"kubernetes.io/projected/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-kube-api-access-lhtbx\") pod \"placement-db-create-vnjc2\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.235067 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dc695c-b1e7-4eb8-9345-63a7b9365334-operator-scripts\") pod \"placement-ce36-account-create-update-66fz7\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.235085 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkb2w\" (UniqueName: \"kubernetes.io/projected/46dc695c-b1e7-4eb8-9345-63a7b9365334-kube-api-access-lkb2w\") pod \"placement-ce36-account-create-update-66fz7\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.235809 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-operator-scripts\") pod \"placement-db-create-vnjc2\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.236056 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jp69p"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.237006 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.242885 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jp69p"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.274979 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtbx\" (UniqueName: \"kubernetes.io/projected/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-kube-api-access-lhtbx\") pod \"placement-db-create-vnjc2\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.304918 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.336852 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzkw\" (UniqueName: \"kubernetes.io/projected/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-kube-api-access-8qzkw\") pod \"glance-db-create-jp69p\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.336909 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkb2w\" (UniqueName: \"kubernetes.io/projected/46dc695c-b1e7-4eb8-9345-63a7b9365334-kube-api-access-lkb2w\") pod \"placement-ce36-account-create-update-66fz7\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.336966 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dc695c-b1e7-4eb8-9345-63a7b9365334-operator-scripts\") pod \"placement-ce36-account-create-update-66fz7\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.336999 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-operator-scripts\") pod \"glance-db-create-jp69p\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.337762 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dc695c-b1e7-4eb8-9345-63a7b9365334-operator-scripts\") pod \"placement-ce36-account-create-update-66fz7\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.356287 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkb2w\" (UniqueName: \"kubernetes.io/projected/46dc695c-b1e7-4eb8-9345-63a7b9365334-kube-api-access-lkb2w\") pod \"placement-ce36-account-create-update-66fz7\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.393404 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.399480 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f703-account-create-update-zxv8m"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.407182 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.410247 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.428765 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f703-account-create-update-zxv8m"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.438117 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qzkw\" (UniqueName: \"kubernetes.io/projected/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-kube-api-access-8qzkw\") pod \"glance-db-create-jp69p\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.438181 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-operator-scripts\") pod \"glance-db-create-jp69p\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.439261 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-operator-scripts\") pod \"glance-db-create-jp69p\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.465021 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qzkw\" (UniqueName: \"kubernetes.io/projected/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-kube-api-access-8qzkw\") pod \"glance-db-create-jp69p\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.542589 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjh48\" (UniqueName: \"kubernetes.io/projected/21d9a4c5-61b5-4342-8b9f-853fefc25329-kube-api-access-hjh48\") pod \"glance-f703-account-create-update-zxv8m\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.543364 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d9a4c5-61b5-4342-8b9f-853fefc25329-operator-scripts\") pod \"glance-f703-account-create-update-zxv8m\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.577962 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jp69p" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.588795 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6jp7t"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.645387 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d9a4c5-61b5-4342-8b9f-853fefc25329-operator-scripts\") pod \"glance-f703-account-create-update-zxv8m\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.645534 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjh48\" (UniqueName: \"kubernetes.io/projected/21d9a4c5-61b5-4342-8b9f-853fefc25329-kube-api-access-hjh48\") pod \"glance-f703-account-create-update-zxv8m\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.647261 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d9a4c5-61b5-4342-8b9f-853fefc25329-operator-scripts\") pod \"glance-f703-account-create-update-zxv8m\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.666500 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjh48\" (UniqueName: \"kubernetes.io/projected/21d9a4c5-61b5-4342-8b9f-853fefc25329-kube-api-access-hjh48\") pod \"glance-f703-account-create-update-zxv8m\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.728190 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b3bc-account-create-update-wf22b"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.733185 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:39 crc kubenswrapper[4672]: W1206 09:22:39.735142 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffeed4c1_b7f1_4475_850b_f768a7aafe8a.slice/crio-ddde069b6bc9decd425ae6433f290fbc802c200b96299b79c9c52ac95b5a83d0 WatchSource:0}: Error finding container ddde069b6bc9decd425ae6433f290fbc802c200b96299b79c9c52ac95b5a83d0: Status 404 returned error can't find the container with id ddde069b6bc9decd425ae6433f290fbc802c200b96299b79c9c52ac95b5a83d0 Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.820080 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x2jpt"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.824862 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.836646 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x2jpt"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.852556 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vnjc2"] Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.935043 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ce36-account-create-update-66fz7"] Dec 06 09:22:39 crc kubenswrapper[4672]: W1206 09:22:39.942225 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dc695c_b1e7_4eb8_9345_63a7b9365334.slice/crio-695bb61b60e82d45c0fc7be72c98225d18cbcdb4507413235f2a37c4ad46d8ed WatchSource:0}: Error finding container 695bb61b60e82d45c0fc7be72c98225d18cbcdb4507413235f2a37c4ad46d8ed: Status 404 returned error can't find the container with id 695bb61b60e82d45c0fc7be72c98225d18cbcdb4507413235f2a37c4ad46d8ed Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.952362 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzn76\" (UniqueName: \"kubernetes.io/projected/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-kube-api-access-mzn76\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.952529 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-utilities\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:39 crc kubenswrapper[4672]: I1206 09:22:39.953407 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-catalog-content\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.056503 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-catalog-content\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.056557 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzn76\" (UniqueName: \"kubernetes.io/projected/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-kube-api-access-mzn76\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.056583 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-utilities\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:40 crc 
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.057550 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-catalog-content\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.057667 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-utilities\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.059908 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jp69p"]
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.073130 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzn76\" (UniqueName: \"kubernetes.io/projected/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-kube-api-access-mzn76\") pod \"redhat-operators-x2jpt\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " pod="openshift-marketplace/redhat-operators-x2jpt"
Dec 06 09:22:40 crc kubenswrapper[4672]: W1206 09:22:40.084575 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eecb7d4_6ed6_4f94_9a6c_dc8abfd473a9.slice/crio-64eb6069959a7107275c0ab74eecffa4fa33ac167684f5076ab8948e371e0041 WatchSource:0}: Error finding container 64eb6069959a7107275c0ab74eecffa4fa33ac167684f5076ab8948e371e0041: Status 404 returned error can't find the container with id 64eb6069959a7107275c0ab74eecffa4fa33ac167684f5076ab8948e371e0041
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.091502 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f703-account-create-update-zxv8m"]
Dec 06 09:22:40 crc kubenswrapper[4672]: W1206 09:22:40.099039 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21d9a4c5_61b5_4342_8b9f_853fefc25329.slice/crio-65b1ac430e569847fbe70886007934b5cda1f05df852f7a16eb472d8becbeb02 WatchSource:0}: Error finding container 65b1ac430e569847fbe70886007934b5cda1f05df852f7a16eb472d8becbeb02: Status 404 returned error can't find the container with id 65b1ac430e569847fbe70886007934b5cda1f05df852f7a16eb472d8becbeb02
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.168968 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2jpt"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.559858 4672 generic.go:334] "Generic (PLEG): container finished" podID="ffeed4c1-b7f1-4475-850b-f768a7aafe8a" containerID="01be840edd74c703d608f7e92900632621eb912485a14429b820f8b313b66b7d" exitCode=0
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.562250 4672 generic.go:334] "Generic (PLEG): container finished" podID="d4ecba74-762d-4ca3-a6c4-99c9804d5d64" containerID="ca10994a9fd98630e1d44279d028b49b0db78557ffea7c52668ca9ba336d5558" exitCode=0
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.563468 4672 generic.go:334] "Generic (PLEG): container finished" podID="438c535f-88ae-4e3d-98f9-014d67606706" containerID="03fbec216a02d03977f726a9d1a0ec30b683bf17c08165795eeedd3707415d9f" exitCode=0
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.563885 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baac82c-1001-4c74-9efe-b11e29efcce9" path="/var/lib/kubelet/pods/4baac82c-1001-4c74-9efe-b11e29efcce9/volumes"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564423 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jp69p" event={"ID":"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9","Type":"ContainerStarted","Data":"dc597b7112339745322ed2b80445088fdd35a47d4a64341aecdc32474035ea27"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564461 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jp69p" event={"ID":"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9","Type":"ContainerStarted","Data":"64eb6069959a7107275c0ab74eecffa4fa33ac167684f5076ab8948e371e0041"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564474 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b3bc-account-create-update-wf22b" event={"ID":"ffeed4c1-b7f1-4475-850b-f768a7aafe8a","Type":"ContainerDied","Data":"01be840edd74c703d608f7e92900632621eb912485a14429b820f8b313b66b7d"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564485 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b3bc-account-create-update-wf22b" event={"ID":"ffeed4c1-b7f1-4475-850b-f768a7aafe8a","Type":"ContainerStarted","Data":"ddde069b6bc9decd425ae6433f290fbc802c200b96299b79c9c52ac95b5a83d0"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564493 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnjc2" event={"ID":"d4ecba74-762d-4ca3-a6c4-99c9804d5d64","Type":"ContainerDied","Data":"ca10994a9fd98630e1d44279d028b49b0db78557ffea7c52668ca9ba336d5558"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564504 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnjc2" event={"ID":"d4ecba74-762d-4ca3-a6c4-99c9804d5d64","Type":"ContainerStarted","Data":"b39ad11815e92acff6fe566c41b188dbd79e3d87717fe75990ec0ff325c57102"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6jp7t" event={"ID":"438c535f-88ae-4e3d-98f9-014d67606706","Type":"ContainerDied","Data":"03fbec216a02d03977f726a9d1a0ec30b683bf17c08165795eeedd3707415d9f"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564523 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6jp7t" event={"ID":"438c535f-88ae-4e3d-98f9-014d67606706","Type":"ContainerStarted","Data":"f26e91387c8911521d4c8260b2e3a0ee0d80330124baa0c16996ba2c2acf47d3"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564610 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce36-account-create-update-66fz7" event={"ID":"46dc695c-b1e7-4eb8-9345-63a7b9365334","Type":"ContainerStarted","Data":"f2049860aae71f4e068fa600205d6bc7f390cef78952f53e563e8510a67184c2"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.564642 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce36-account-create-update-66fz7" event={"ID":"46dc695c-b1e7-4eb8-9345-63a7b9365334","Type":"ContainerStarted","Data":"695bb61b60e82d45c0fc7be72c98225d18cbcdb4507413235f2a37c4ad46d8ed"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.565710 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f703-account-create-update-zxv8m" event={"ID":"21d9a4c5-61b5-4342-8b9f-853fefc25329","Type":"ContainerStarted","Data":"f65e4895b4e77ee8d5d836970adfc6c1ce905d5b3cfb8c43aae2719db8718e27"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.565739 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f703-account-create-update-zxv8m" event={"ID":"21d9a4c5-61b5-4342-8b9f-853fefc25329","Type":"ContainerStarted","Data":"65b1ac430e569847fbe70886007934b5cda1f05df852f7a16eb472d8becbeb02"}
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.573258 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-jp69p" podStartSLOduration=1.573212593 podStartE2EDuration="1.573212593s" podCreationTimestamp="2025-12-06 09:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:40.572963257 +0000 UTC m=+978.317223534" watchObservedRunningTime="2025-12-06 09:22:40.573212593 +0000 UTC m=+978.317472900"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.645312 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f703-account-create-update-zxv8m" podStartSLOduration=1.6452802210000002 podStartE2EDuration="1.645280221s" podCreationTimestamp="2025-12-06 09:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:40.63086982 +0000 UTC m=+978.375130107" watchObservedRunningTime="2025-12-06 09:22:40.645280221 +0000 UTC m=+978.389540508"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.661904 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ce36-account-create-update-66fz7" podStartSLOduration=1.6618849519999999 podStartE2EDuration="1.661884952s" podCreationTimestamp="2025-12-06 09:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:40.661536003 +0000 UTC m=+978.405796300" watchObservedRunningTime="2025-12-06 09:22:40.661884952 +0000 UTC m=+978.406145239"
Dec 06 09:22:40 crc kubenswrapper[4672]: I1206 09:22:40.688044 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x2jpt"]
Dec 06 09:22:40 crc kubenswrapper[4672]: W1206 09:22:40.759501 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e3e6e1_ca0e_405e_a482_539d4dc9cadd.slice/crio-477c0791cf1be36b66cd2e1f455ef7575b6a01c6d0a3ab53e338c5aa79eea814 WatchSource:0}: Error finding container 477c0791cf1be36b66cd2e1f455ef7575b6a01c6d0a3ab53e338c5aa79eea814: Status 404 returned error can't find the container with id 477c0791cf1be36b66cd2e1f455ef7575b6a01c6d0a3ab53e338c5aa79eea814
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.577158 4672 generic.go:334] "Generic (PLEG): container finished" podID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerID="381fc07eb5acff5db2a559d0392488dffabb117890007699ba0a7a8b6e65f3e1" exitCode=0
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.577245 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerDied","Data":"381fc07eb5acff5db2a559d0392488dffabb117890007699ba0a7a8b6e65f3e1"}
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.577588 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerStarted","Data":"477c0791cf1be36b66cd2e1f455ef7575b6a01c6d0a3ab53e338c5aa79eea814"}
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.581302 4672 generic.go:334] "Generic (PLEG): container finished" podID="4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" containerID="dc597b7112339745322ed2b80445088fdd35a47d4a64341aecdc32474035ea27" exitCode=0
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.581363 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jp69p" event={"ID":"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9","Type":"ContainerDied","Data":"dc597b7112339745322ed2b80445088fdd35a47d4a64341aecdc32474035ea27"}
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.583851 4672 generic.go:334] "Generic (PLEG): container finished" podID="46dc695c-b1e7-4eb8-9345-63a7b9365334" containerID="f2049860aae71f4e068fa600205d6bc7f390cef78952f53e563e8510a67184c2" exitCode=0
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.584052 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce36-account-create-update-66fz7" event={"ID":"46dc695c-b1e7-4eb8-9345-63a7b9365334","Type":"ContainerDied","Data":"f2049860aae71f4e068fa600205d6bc7f390cef78952f53e563e8510a67184c2"}
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.589272 4672 generic.go:334] "Generic (PLEG): container finished" podID="21d9a4c5-61b5-4342-8b9f-853fefc25329" containerID="f65e4895b4e77ee8d5d836970adfc6c1ce905d5b3cfb8c43aae2719db8718e27" exitCode=0
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.589701 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f703-account-create-update-zxv8m" event={"ID":"21d9a4c5-61b5-4342-8b9f-853fefc25329","Type":"ContainerDied","Data":"f65e4895b4e77ee8d5d836970adfc6c1ce905d5b3cfb8c43aae2719db8718e27"}
Dec 06 09:22:41 crc kubenswrapper[4672]: I1206 09:22:41.899826 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.061122 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vnjc2"
Need to start a new one" pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.190868 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-operator-scripts\") pod \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.191269 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtbx\" (UniqueName: \"kubernetes.io/projected/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-kube-api-access-lhtbx\") pod \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\" (UID: \"d4ecba74-762d-4ca3-a6c4-99c9804d5d64\") " Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.192841 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4ecba74-762d-4ca3-a6c4-99c9804d5d64" (UID: "d4ecba74-762d-4ca3-a6c4-99c9804d5d64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.194927 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.198119 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-kube-api-access-lhtbx" (OuterVolumeSpecName: "kube-api-access-lhtbx") pod "d4ecba74-762d-4ca3-a6c4-99c9804d5d64" (UID: "d4ecba74-762d-4ca3-a6c4-99c9804d5d64"). InnerVolumeSpecName "kube-api-access-lhtbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.286094 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.292313 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-operator-scripts\") pod \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.292417 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzg8r\" (UniqueName: \"kubernetes.io/projected/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-kube-api-access-jzg8r\") pod \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\" (UID: \"ffeed4c1-b7f1-4475-850b-f768a7aafe8a\") " Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.292864 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhtbx\" (UniqueName: \"kubernetes.io/projected/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-kube-api-access-lhtbx\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.292885 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ecba74-762d-4ca3-a6c4-99c9804d5d64-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.293891 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffeed4c1-b7f1-4475-850b-f768a7aafe8a" (UID: "ffeed4c1-b7f1-4475-850b-f768a7aafe8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.297881 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-kube-api-access-jzg8r" (OuterVolumeSpecName: "kube-api-access-jzg8r") pod "ffeed4c1-b7f1-4475-850b-f768a7aafe8a" (UID: "ffeed4c1-b7f1-4475-850b-f768a7aafe8a"). InnerVolumeSpecName "kube-api-access-jzg8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.319321 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.319843 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.329417 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.398144 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hshxp\" (UniqueName: \"kubernetes.io/projected/438c535f-88ae-4e3d-98f9-014d67606706-kube-api-access-hshxp\") pod \"438c535f-88ae-4e3d-98f9-014d67606706\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.398228 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438c535f-88ae-4e3d-98f9-014d67606706-operator-scripts\") pod \"438c535f-88ae-4e3d-98f9-014d67606706\" (UID: \"438c535f-88ae-4e3d-98f9-014d67606706\") " Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.398647 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.398661 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzg8r\" (UniqueName: \"kubernetes.io/projected/ffeed4c1-b7f1-4475-850b-f768a7aafe8a-kube-api-access-jzg8r\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.399198 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/438c535f-88ae-4e3d-98f9-014d67606706-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "438c535f-88ae-4e3d-98f9-014d67606706" (UID: "438c535f-88ae-4e3d-98f9-014d67606706"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.401351 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438c535f-88ae-4e3d-98f9-014d67606706-kube-api-access-hshxp" (OuterVolumeSpecName: "kube-api-access-hshxp") pod "438c535f-88ae-4e3d-98f9-014d67606706" (UID: "438c535f-88ae-4e3d-98f9-014d67606706"). InnerVolumeSpecName "kube-api-access-hshxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.500859 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hshxp\" (UniqueName: \"kubernetes.io/projected/438c535f-88ae-4e3d-98f9-014d67606706-kube-api-access-hshxp\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.500894 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438c535f-88ae-4e3d-98f9-014d67606706-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.601153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerStarted","Data":"a677321b132a5f4d5a1b35113012695ad2b8e59c01e77fb3e3c6f8a826203fbd"} Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.602555 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b3bc-account-create-update-wf22b" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.602558 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b3bc-account-create-update-wf22b" event={"ID":"ffeed4c1-b7f1-4475-850b-f768a7aafe8a","Type":"ContainerDied","Data":"ddde069b6bc9decd425ae6433f290fbc802c200b96299b79c9c52ac95b5a83d0"} Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.602691 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddde069b6bc9decd425ae6433f290fbc802c200b96299b79c9c52ac95b5a83d0" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.606821 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vnjc2" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.606820 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnjc2" event={"ID":"d4ecba74-762d-4ca3-a6c4-99c9804d5d64","Type":"ContainerDied","Data":"b39ad11815e92acff6fe566c41b188dbd79e3d87717fe75990ec0ff325c57102"} Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.607060 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39ad11815e92acff6fe566c41b188dbd79e3d87717fe75990ec0ff325c57102" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.610333 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6jp7t" event={"ID":"438c535f-88ae-4e3d-98f9-014d67606706","Type":"ContainerDied","Data":"f26e91387c8911521d4c8260b2e3a0ee0d80330124baa0c16996ba2c2acf47d3"} Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.610363 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f26e91387c8911521d4c8260b2e3a0ee0d80330124baa0c16996ba2c2acf47d3" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.610388 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6jp7t" Dec 06 09:22:42 crc kubenswrapper[4672]: I1206 09:22:42.868419 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.040971 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjh48\" (UniqueName: \"kubernetes.io/projected/21d9a4c5-61b5-4342-8b9f-853fefc25329-kube-api-access-hjh48\") pod \"21d9a4c5-61b5-4342-8b9f-853fefc25329\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.041214 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d9a4c5-61b5-4342-8b9f-853fefc25329-operator-scripts\") pod \"21d9a4c5-61b5-4342-8b9f-853fefc25329\" (UID: \"21d9a4c5-61b5-4342-8b9f-853fefc25329\") " Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.042033 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21d9a4c5-61b5-4342-8b9f-853fefc25329-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21d9a4c5-61b5-4342-8b9f-853fefc25329" (UID: "21d9a4c5-61b5-4342-8b9f-853fefc25329"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.064400 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d9a4c5-61b5-4342-8b9f-853fefc25329-kube-api-access-hjh48" (OuterVolumeSpecName: "kube-api-access-hjh48") pod "21d9a4c5-61b5-4342-8b9f-853fefc25329" (UID: "21d9a4c5-61b5-4342-8b9f-853fefc25329"). InnerVolumeSpecName "kube-api-access-hjh48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.145915 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d9a4c5-61b5-4342-8b9f-853fefc25329-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.145947 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjh48\" (UniqueName: \"kubernetes.io/projected/21d9a4c5-61b5-4342-8b9f-853fefc25329-kube-api-access-hjh48\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.343101 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.455843 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkb2w\" (UniqueName: \"kubernetes.io/projected/46dc695c-b1e7-4eb8-9345-63a7b9365334-kube-api-access-lkb2w\") pod \"46dc695c-b1e7-4eb8-9345-63a7b9365334\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.456313 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dc695c-b1e7-4eb8-9345-63a7b9365334-operator-scripts\") pod \"46dc695c-b1e7-4eb8-9345-63a7b9365334\" (UID: \"46dc695c-b1e7-4eb8-9345-63a7b9365334\") " Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.456934 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46dc695c-b1e7-4eb8-9345-63a7b9365334-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46dc695c-b1e7-4eb8-9345-63a7b9365334" (UID: "46dc695c-b1e7-4eb8-9345-63a7b9365334"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.461276 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46dc695c-b1e7-4eb8-9345-63a7b9365334-kube-api-access-lkb2w" (OuterVolumeSpecName: "kube-api-access-lkb2w") pod "46dc695c-b1e7-4eb8-9345-63a7b9365334" (UID: "46dc695c-b1e7-4eb8-9345-63a7b9365334"). InnerVolumeSpecName "kube-api-access-lkb2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.548762 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jp69p" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.561559 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dc695c-b1e7-4eb8-9345-63a7b9365334-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.561592 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkb2w\" (UniqueName: \"kubernetes.io/projected/46dc695c-b1e7-4eb8-9345-63a7b9365334-kube-api-access-lkb2w\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.619383 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f703-account-create-update-zxv8m" event={"ID":"21d9a4c5-61b5-4342-8b9f-853fefc25329","Type":"ContainerDied","Data":"65b1ac430e569847fbe70886007934b5cda1f05df852f7a16eb472d8becbeb02"} Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.619422 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b1ac430e569847fbe70886007934b5cda1f05df852f7a16eb472d8becbeb02" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.619436 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f703-account-create-update-zxv8m" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.620997 4672 generic.go:334] "Generic (PLEG): container finished" podID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerID="a677321b132a5f4d5a1b35113012695ad2b8e59c01e77fb3e3c6f8a826203fbd" exitCode=0 Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.621048 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerDied","Data":"a677321b132a5f4d5a1b35113012695ad2b8e59c01e77fb3e3c6f8a826203fbd"} Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.624716 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jp69p" event={"ID":"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9","Type":"ContainerDied","Data":"64eb6069959a7107275c0ab74eecffa4fa33ac167684f5076ab8948e371e0041"} Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.624753 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64eb6069959a7107275c0ab74eecffa4fa33ac167684f5076ab8948e371e0041" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.624814 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jp69p" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.630016 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce36-account-create-update-66fz7" event={"ID":"46dc695c-b1e7-4eb8-9345-63a7b9365334","Type":"ContainerDied","Data":"695bb61b60e82d45c0fc7be72c98225d18cbcdb4507413235f2a37c4ad46d8ed"} Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.630081 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="695bb61b60e82d45c0fc7be72c98225d18cbcdb4507413235f2a37c4ad46d8ed" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.630104 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce36-account-create-update-66fz7" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.662940 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qzkw\" (UniqueName: \"kubernetes.io/projected/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-kube-api-access-8qzkw\") pod \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.663045 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-operator-scripts\") pod \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\" (UID: \"4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9\") " Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.663691 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" (UID: "4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.666123 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-kube-api-access-8qzkw" (OuterVolumeSpecName: "kube-api-access-8qzkw") pod "4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" (UID: "4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9"). InnerVolumeSpecName "kube-api-access-8qzkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.764876 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qzkw\" (UniqueName: \"kubernetes.io/projected/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-kube-api-access-8qzkw\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:43 crc kubenswrapper[4672]: I1206 09:22:43.764902 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.612732 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tlfw9"] Dec 06 09:22:44 crc kubenswrapper[4672]: E1206 09:22:44.613440 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c535f-88ae-4e3d-98f9-014d67606706" containerName="mariadb-database-create" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613465 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c535f-88ae-4e3d-98f9-014d67606706" containerName="mariadb-database-create" Dec 06 09:22:44 crc kubenswrapper[4672]: E1206 09:22:44.613484 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d9a4c5-61b5-4342-8b9f-853fefc25329" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613492 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d9a4c5-61b5-4342-8b9f-853fefc25329" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: E1206 09:22:44.613501 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffeed4c1-b7f1-4475-850b-f768a7aafe8a" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613508 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffeed4c1-b7f1-4475-850b-f768a7aafe8a" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: E1206 09:22:44.613525 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" containerName="mariadb-database-create" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613532 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" containerName="mariadb-database-create" Dec 06 09:22:44 crc kubenswrapper[4672]: E1206 09:22:44.613550 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ecba74-762d-4ca3-a6c4-99c9804d5d64" containerName="mariadb-database-create" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613557 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ecba74-762d-4ca3-a6c4-99c9804d5d64" containerName="mariadb-database-create" Dec 06 09:22:44 crc kubenswrapper[4672]: E1206 09:22:44.613582 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dc695c-b1e7-4eb8-9345-63a7b9365334" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613591 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dc695c-b1e7-4eb8-9345-63a7b9365334" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613814 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffeed4c1-b7f1-4475-850b-f768a7aafe8a" containerName="mariadb-account-create-update" Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613839 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d9a4c5-61b5-4342-8b9f-853fefc25329" containerName="mariadb-account-create-update"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613855 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ecba74-762d-4ca3-a6c4-99c9804d5d64" containerName="mariadb-database-create"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613866 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c535f-88ae-4e3d-98f9-014d67606706" containerName="mariadb-database-create"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613877 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="46dc695c-b1e7-4eb8-9345-63a7b9365334" containerName="mariadb-account-create-update"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.613896 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" containerName="mariadb-database-create"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.614473 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.616717 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.617362 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zbcbp"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.629835 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tlfw9"]
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.637195 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerStarted","Data":"3bac0248798368475a264c4f86e0308d2c7d9a750719eb30513ecc5b2159349d"}
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.677917 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-combined-ca-bundle\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.678025 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-db-sync-config-data\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.678098 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mwg\" (UniqueName: \"kubernetes.io/projected/1d834bae-e9bc-4c2e-be87-2749743f8ef0-kube-api-access-h2mwg\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.678161 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-config-data\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.779824 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-combined-ca-bundle\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.779892 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-db-sync-config-data\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.779941 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mwg\" (UniqueName: \"kubernetes.io/projected/1d834bae-e9bc-4c2e-be87-2749743f8ef0-kube-api-access-h2mwg\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.779980 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-config-data\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.785068 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-combined-ca-bundle\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.785372 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-db-sync-config-data\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.793237 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-config-data\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.799979 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mwg\" (UniqueName: \"kubernetes.io/projected/1d834bae-e9bc-4c2e-be87-2749743f8ef0-kube-api-access-h2mwg\") pod \"glance-db-sync-tlfw9\" (UID: \"1d834bae-e9bc-4c2e-be87-2749743f8ef0\") " pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:44 crc kubenswrapper[4672]: I1206 09:22:44.930509 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tlfw9"
Dec 06 09:22:45 crc kubenswrapper[4672]: I1206 09:22:45.582725 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x2jpt" podStartSLOduration=4.112156314 podStartE2EDuration="6.582369908s" podCreationTimestamp="2025-12-06 09:22:39 +0000 UTC" firstStartedPulling="2025-12-06 09:22:41.579536126 +0000 UTC m=+979.323796413" lastFinishedPulling="2025-12-06 09:22:44.04974972 +0000 UTC m=+981.794010007" observedRunningTime="2025-12-06 09:22:44.679239239 +0000 UTC m=+982.423499536" watchObservedRunningTime="2025-12-06 09:22:45.582369908 +0000 UTC m=+983.326630195"
Dec 06 09:22:45 crc kubenswrapper[4672]: I1206 09:22:45.587768 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tlfw9"]
Dec 06 09:22:45 crc kubenswrapper[4672]: I1206 09:22:45.646412 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tlfw9" event={"ID":"1d834bae-e9bc-4c2e-be87-2749743f8ef0","Type":"ContainerStarted","Data":"e00a98a936db6d3617bb693fb205c39aba6487a6e07a73bb1f6f0edd4d119996"}
Dec 06 09:22:45 crc kubenswrapper[4672]: I1206 09:22:45.994246 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hxgmq" podUID="2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6" containerName="ovn-controller" probeResult="failure" output=<
Dec 06 09:22:45 crc kubenswrapper[4672]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 06 09:22:45 crc kubenswrapper[4672]: >
Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.080449 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rsxq7"
Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.116831 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rsxq7"
Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.328956 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hxgmq-config-92rwn"]
Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.330047 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-92rwn"
Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.334032 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.346997 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hxgmq-config-92rwn"] Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.412772 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-log-ovn\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.412820 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7z9\" (UniqueName: \"kubernetes.io/projected/348be5bc-2aea-4ab4-aa50-4d1312b4de17-kube-api-access-4n7z9\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.412859 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-additional-scripts\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.412887 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run-ovn\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.412918 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-scripts\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.412967 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.513847 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.513924 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-log-ovn\") pod 
\"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.513947 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7z9\" (UniqueName: \"kubernetes.io/projected/348be5bc-2aea-4ab4-aa50-4d1312b4de17-kube-api-access-4n7z9\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.513980 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-additional-scripts\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.514014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run-ovn\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.514037 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-scripts\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.516014 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-scripts\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.516236 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.516280 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-log-ovn\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.517009 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-additional-scripts\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.517062 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run-ovn\") pod 
\"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.536929 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7z9\" (UniqueName: \"kubernetes.io/projected/348be5bc-2aea-4ab4-aa50-4d1312b4de17-kube-api-access-4n7z9\") pod \"ovn-controller-hxgmq-config-92rwn\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:46 crc kubenswrapper[4672]: I1206 09:22:46.656099 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:47 crc kubenswrapper[4672]: I1206 09:22:47.131267 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hxgmq-config-92rwn"] Dec 06 09:22:47 crc kubenswrapper[4672]: I1206 09:22:47.670823 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq-config-92rwn" event={"ID":"348be5bc-2aea-4ab4-aa50-4d1312b4de17","Type":"ContainerStarted","Data":"41d55caf93c46448c406ef609682cf843f2af2eab851696b701eb94d34dab6eb"} Dec 06 09:22:48 crc kubenswrapper[4672]: I1206 09:22:48.683917 4672 generic.go:334] "Generic (PLEG): container finished" podID="348be5bc-2aea-4ab4-aa50-4d1312b4de17" containerID="309afffbabec4d43c69870d78c6a5d2f44dc65cf1d57c3157ccc80462025e840" exitCode=0 Dec 06 09:22:48 crc kubenswrapper[4672]: I1206 09:22:48.684089 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq-config-92rwn" event={"ID":"348be5bc-2aea-4ab4-aa50-4d1312b4de17","Type":"ContainerDied","Data":"309afffbabec4d43c69870d78c6a5d2f44dc65cf1d57c3157ccc80462025e840"} Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.049569 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.170013 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.170433 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.193551 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n7z9\" (UniqueName: \"kubernetes.io/projected/348be5bc-2aea-4ab4-aa50-4d1312b4de17-kube-api-access-4n7z9\") pod \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.193632 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-additional-scripts\") pod \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.193661 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-log-ovn\") pod \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.193734 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run-ovn\") pod \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.193786 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run\") pod \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.193825 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-scripts\") pod \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\" (UID: \"348be5bc-2aea-4ab4-aa50-4d1312b4de17\") " Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.194238 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "348be5bc-2aea-4ab4-aa50-4d1312b4de17" (UID: "348be5bc-2aea-4ab4-aa50-4d1312b4de17"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.194319 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run" (OuterVolumeSpecName: "var-run") pod "348be5bc-2aea-4ab4-aa50-4d1312b4de17" (UID: "348be5bc-2aea-4ab4-aa50-4d1312b4de17"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.194411 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "348be5bc-2aea-4ab4-aa50-4d1312b4de17" (UID: "348be5bc-2aea-4ab4-aa50-4d1312b4de17"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.194482 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "348be5bc-2aea-4ab4-aa50-4d1312b4de17" (UID: "348be5bc-2aea-4ab4-aa50-4d1312b4de17"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.195068 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-scripts" (OuterVolumeSpecName: "scripts") pod "348be5bc-2aea-4ab4-aa50-4d1312b4de17" (UID: "348be5bc-2aea-4ab4-aa50-4d1312b4de17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.202788 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348be5bc-2aea-4ab4-aa50-4d1312b4de17-kube-api-access-4n7z9" (OuterVolumeSpecName: "kube-api-access-4n7z9") pod "348be5bc-2aea-4ab4-aa50-4d1312b4de17" (UID: "348be5bc-2aea-4ab4-aa50-4d1312b4de17"). InnerVolumeSpecName "kube-api-access-4n7z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.226257 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.295771 4672 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.295812 4672 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.295822 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.295834 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n7z9\" (UniqueName: \"kubernetes.io/projected/348be5bc-2aea-4ab4-aa50-4d1312b4de17-kube-api-access-4n7z9\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.295845 4672 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/348be5bc-2aea-4ab4-aa50-4d1312b4de17-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.295855 4672 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/348be5bc-2aea-4ab4-aa50-4d1312b4de17-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.715227 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-92rwn" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.716131 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq-config-92rwn" event={"ID":"348be5bc-2aea-4ab4-aa50-4d1312b4de17","Type":"ContainerDied","Data":"41d55caf93c46448c406ef609682cf843f2af2eab851696b701eb94d34dab6eb"} Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.716184 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41d55caf93c46448c406ef609682cf843f2af2eab851696b701eb94d34dab6eb" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.763169 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.802840 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x2jpt"] Dec 06 09:22:50 crc kubenswrapper[4672]: I1206 09:22:50.985275 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hxgmq" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.142558 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hxgmq-config-92rwn"] Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.148367 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hxgmq-config-92rwn"] Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.290919 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hxgmq-config-g9fpp"] Dec 06 09:22:51 crc kubenswrapper[4672]: E1206 09:22:51.291237 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348be5bc-2aea-4ab4-aa50-4d1312b4de17" containerName="ovn-config" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.291249 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="348be5bc-2aea-4ab4-aa50-4d1312b4de17" containerName="ovn-config" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.291440 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="348be5bc-2aea-4ab4-aa50-4d1312b4de17" containerName="ovn-config" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.294047 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.296435 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.319227 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hxgmq-config-g9fpp"] Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.415755 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.415796 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-log-ovn\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.415831 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-additional-scripts\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.415886 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmkh\" (UniqueName: \"kubernetes.io/projected/09781861-e068-40bb-9190-cf582605a8ad-kube-api-access-2hmkh\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.415929 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-scripts\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.415961 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run-ovn\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518058 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmkh\" (UniqueName: \"kubernetes.io/projected/09781861-e068-40bb-9190-cf582605a8ad-kube-api-access-2hmkh\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518123 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-scripts\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518153 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run-ovn\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518204 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-log-ovn\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-additional-scripts\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.518993 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-additional-scripts\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.519669 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.519716 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run-ovn\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.519752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-log-ovn\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.523022 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-scripts\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.538205 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmkh\" (UniqueName: \"kubernetes.io/projected/09781861-e068-40bb-9190-cf582605a8ad-kube-api-access-2hmkh\") pod \"ovn-controller-hxgmq-config-g9fpp\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") " pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:51 crc kubenswrapper[4672]: I1206 09:22:51.661248 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-g9fpp" Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.578028 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348be5bc-2aea-4ab4-aa50-4d1312b4de17" path="/var/lib/kubelet/pods/348be5bc-2aea-4ab4-aa50-4d1312b4de17/volumes" Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.731949 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x2jpt" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="registry-server" containerID="cri-o://3bac0248798368475a264c4f86e0308d2c7d9a750719eb30513ecc5b2159349d" gracePeriod=2 Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.892736 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-js95z"] Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.895458 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.916194 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-js95z"] Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.959680 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2fk\" (UniqueName: \"kubernetes.io/projected/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-kube-api-access-hr2fk\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.959750 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-catalog-content\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:52 crc kubenswrapper[4672]: I1206 09:22:52.959783 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-utilities\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.060843 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-catalog-content\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " 
pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.061125 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-utilities\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.061399 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-catalog-content\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.061545 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-utilities\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.062063 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2fk\" (UniqueName: \"kubernetes.io/projected/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-kube-api-access-hr2fk\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.094631 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2fk\" (UniqueName: \"kubernetes.io/projected/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-kube-api-access-hr2fk\") pod \"community-operators-js95z\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.236748 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-js95z" Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.754265 4672 generic.go:334] "Generic (PLEG): container finished" podID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerID="3bac0248798368475a264c4f86e0308d2c7d9a750719eb30513ecc5b2159349d" exitCode=0 Dec 06 09:22:53 crc kubenswrapper[4672]: I1206 09:22:53.754305 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerDied","Data":"3bac0248798368475a264c4f86e0308d2c7d9a750719eb30513ecc5b2159349d"} Dec 06 09:22:54 crc kubenswrapper[4672]: I1206 09:22:54.776493 4672 generic.go:334] "Generic (PLEG): container finished" podID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerID="0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2" exitCode=0 Dec 06 09:22:54 crc kubenswrapper[4672]: I1206 09:22:54.776556 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1bbe623e-19ec-49f2-bfa4-65728b94d035","Type":"ContainerDied","Data":"0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2"} Dec 06 09:22:54 crc kubenswrapper[4672]: I1206 09:22:54.780131 4672 generic.go:334] "Generic (PLEG): container finished" podID="54ae723f-36b7-4991-9439-23af064249fa" containerID="89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473" exitCode=0 Dec 06 09:22:54 crc kubenswrapper[4672]: I1206 09:22:54.780156 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"54ae723f-36b7-4991-9439-23af064249fa","Type":"ContainerDied","Data":"89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473"} Dec 06 09:22:58 crc kubenswrapper[4672]: I1206 09:22:58.992006 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2jpt" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.087250 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzn76\" (UniqueName: \"kubernetes.io/projected/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-kube-api-access-mzn76\") pod \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.087845 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-catalog-content\") pod \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.088040 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-utilities\") pod \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\" (UID: \"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd\") " Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.089178 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-utilities" (OuterVolumeSpecName: "utilities") pod "b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" (UID: "b2e3e6e1-ca0e-405e-a482-539d4dc9cadd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.091660 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-kube-api-access-mzn76" (OuterVolumeSpecName: "kube-api-access-mzn76") pod "b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" (UID: "b2e3e6e1-ca0e-405e-a482-539d4dc9cadd"). InnerVolumeSpecName "kube-api-access-mzn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.190477 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzn76\" (UniqueName: \"kubernetes.io/projected/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-kube-api-access-mzn76\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.190509 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.236313 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" (UID: "b2e3e6e1-ca0e-405e-a482-539d4dc9cadd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.292156 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.319575 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-js95z"] Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.345631 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hxgmq-config-g9fpp"] Dec 06 09:22:59 crc kubenswrapper[4672]: W1206 09:22:59.351347 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aed11bb_ab80_496c_b92a_3fdc6a9fc044.slice/crio-e78d25dc3fa28360c8731b92cf623da1f1e6287a5246616f6729e5fa42a809e3 WatchSource:0}: Error finding container e78d25dc3fa28360c8731b92cf623da1f1e6287a5246616f6729e5fa42a809e3: Status 404 returned error can't find the container with id e78d25dc3fa28360c8731b92cf623da1f1e6287a5246616f6729e5fa42a809e3 Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.837421 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tlfw9" event={"ID":"1d834bae-e9bc-4c2e-be87-2749743f8ef0","Type":"ContainerStarted","Data":"b76c56d4d0ca7315c8a755fa2c649eaa1e616d7550c7c65fdb869c70285c12a7"} Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.840313 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2jpt" event={"ID":"b2e3e6e1-ca0e-405e-a482-539d4dc9cadd","Type":"ContainerDied","Data":"477c0791cf1be36b66cd2e1f455ef7575b6a01c6d0a3ab53e338c5aa79eea814"} Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.840577 4672 scope.go:117] "RemoveContainer" containerID="3bac0248798368475a264c4f86e0308d2c7d9a750719eb30513ecc5b2159349d" Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.841326 4672 util.go:48] "No ready 
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.842898 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1bbe623e-19ec-49f2-bfa4-65728b94d035","Type":"ContainerStarted","Data":"c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f"}
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.843148 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.845693 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"54ae723f-36b7-4991-9439-23af064249fa","Type":"ContainerStarted","Data":"e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f"}
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.846231 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.850326 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq-config-g9fpp" event={"ID":"09781861-e068-40bb-9190-cf582605a8ad","Type":"ContainerStarted","Data":"12d31ea3e9e8a632f558ec03f94c5d53a20da96056cf20b983e3317e458b5510"}
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.850408 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq-config-g9fpp" event={"ID":"09781861-e068-40bb-9190-cf582605a8ad","Type":"ContainerStarted","Data":"984815d6799c4c7871bd7e19eb5ebadd87bd0ce522758b1bfdaa0bff51684189"}
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.857043 4672 generic.go:334] "Generic (PLEG): container finished" podID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerID="c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d" exitCode=0
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.857454 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-js95z" event={"ID":"5aed11bb-ab80-496c-b92a-3fdc6a9fc044","Type":"ContainerDied","Data":"c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d"}
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.857565 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-js95z" event={"ID":"5aed11bb-ab80-496c-b92a-3fdc6a9fc044","Type":"ContainerStarted","Data":"e78d25dc3fa28360c8731b92cf623da1f1e6287a5246616f6729e5fa42a809e3"}
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.872242 4672 scope.go:117] "RemoveContainer" containerID="a677321b132a5f4d5a1b35113012695ad2b8e59c01e77fb3e3c6f8a826203fbd"
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.880483 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tlfw9" podStartSLOduration=2.6326188249999998 podStartE2EDuration="15.880465675s" podCreationTimestamp="2025-12-06 09:22:44 +0000 UTC" firstStartedPulling="2025-12-06 09:22:45.598446245 +0000 UTC m=+983.342706562" lastFinishedPulling="2025-12-06 09:22:58.846293125 +0000 UTC m=+996.590553412" observedRunningTime="2025-12-06 09:22:59.874515543 +0000 UTC m=+997.618775830" watchObservedRunningTime="2025-12-06 09:22:59.880465675 +0000 UTC m=+997.624725962"
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.911715 4672 scope.go:117] "RemoveContainer" containerID="381fc07eb5acff5db2a559d0392488dffabb117890007699ba0a7a8b6e65f3e1"
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.958915 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hxgmq-config-g9fpp" podStartSLOduration=8.958891135 podStartE2EDuration="8.958891135s" podCreationTimestamp="2025-12-06 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:22:59.952984515 +0000 UTC m=+997.697244802" watchObservedRunningTime="2025-12-06 09:22:59.958891135 +0000 UTC m=+997.703151422"
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.979780 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x2jpt"]
Dec 06 09:22:59 crc kubenswrapper[4672]: I1206 09:22:59.994845 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x2jpt"]
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.017192 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.53245337 podStartE2EDuration="1m26.017169448s" podCreationTimestamp="2025-12-06 09:21:34 +0000 UTC" firstStartedPulling="2025-12-06 09:21:36.524047464 +0000 UTC m=+914.268307751" lastFinishedPulling="2025-12-06 09:22:20.008763542 +0000 UTC m=+957.753023829" observedRunningTime="2025-12-06 09:23:00.010456966 +0000 UTC m=+997.754717253" watchObservedRunningTime="2025-12-06 09:23:00.017169448 +0000 UTC m=+997.761429735"
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.061496 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.218183433 podStartE2EDuration="1m26.061469931s" podCreationTimestamp="2025-12-06 09:21:34 +0000 UTC" firstStartedPulling="2025-12-06 09:21:36.413031809 +0000 UTC m=+914.157292096" lastFinishedPulling="2025-12-06 09:22:20.256318297 +0000 UTC m=+958.000578594" observedRunningTime="2025-12-06 09:23:00.058491431 +0000 UTC m=+997.802751718" watchObservedRunningTime="2025-12-06 09:23:00.061469931 +0000 UTC m=+997.805730218"
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.613995 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" path="/var/lib/kubelet/pods/b2e3e6e1-ca0e-405e-a482-539d4dc9cadd/volumes"
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.865388 4672 generic.go:334] "Generic (PLEG): container finished" podID="09781861-e068-40bb-9190-cf582605a8ad" containerID="12d31ea3e9e8a632f558ec03f94c5d53a20da96056cf20b983e3317e458b5510" exitCode=0
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.865445 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hxgmq-config-g9fpp" event={"ID":"09781861-e068-40bb-9190-cf582605a8ad","Type":"ContainerDied","Data":"12d31ea3e9e8a632f558ec03f94c5d53a20da96056cf20b983e3317e458b5510"}
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.867639 4672 generic.go:334] "Generic (PLEG): container finished" podID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerID="1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a" exitCode=0
Dec 06 09:23:00 crc kubenswrapper[4672]: I1206 09:23:00.867757 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-js95z" event={"ID":"5aed11bb-ab80-496c-b92a-3fdc6a9fc044","Type":"ContainerDied","Data":"1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a"}
Dec 06 09:23:01 crc kubenswrapper[4672]: I1206 09:23:01.877255 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-js95z" event={"ID":"5aed11bb-ab80-496c-b92a-3fdc6a9fc044","Type":"ContainerStarted","Data":"0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7"}
Dec 06 09:23:01 crc kubenswrapper[4672]: I1206 09:23:01.899490 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-js95z" podStartSLOduration=8.511811514 podStartE2EDuration="9.899449804s" podCreationTimestamp="2025-12-06 09:22:52 +0000 UTC" firstStartedPulling="2025-12-06 09:22:59.872096327 +0000 UTC m=+997.616356624" lastFinishedPulling="2025-12-06 09:23:01.259734627 +0000 UTC m=+999.003994914" observedRunningTime="2025-12-06 09:23:01.898646922 +0000 UTC m=+999.642907209" watchObservedRunningTime="2025-12-06 09:23:01.899449804 +0000 UTC m=+999.643710111"
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.208384 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-g9fpp"
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276340 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run-ovn\") pod \"09781861-e068-40bb-9190-cf582605a8ad\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") "
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276461 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-additional-scripts\") pod \"09781861-e068-40bb-9190-cf582605a8ad\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") "
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276483 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-scripts\") pod \"09781861-e068-40bb-9190-cf582605a8ad\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") "
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276480 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "09781861-e068-40bb-9190-cf582605a8ad" (UID: "09781861-e068-40bb-9190-cf582605a8ad"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276546 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run\") pod \"09781861-e068-40bb-9190-cf582605a8ad\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") "
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276569 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hmkh\" (UniqueName: \"kubernetes.io/projected/09781861-e068-40bb-9190-cf582605a8ad-kube-api-access-2hmkh\") pod \"09781861-e068-40bb-9190-cf582605a8ad\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") "
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276590 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-log-ovn\") pod \"09781861-e068-40bb-9190-cf582605a8ad\" (UID: \"09781861-e068-40bb-9190-cf582605a8ad\") "
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276937 4672 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.276985 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "09781861-e068-40bb-9190-cf582605a8ad" (UID: "09781861-e068-40bb-9190-cf582605a8ad"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.277014 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run" (OuterVolumeSpecName: "var-run") pod "09781861-e068-40bb-9190-cf582605a8ad" (UID: "09781861-e068-40bb-9190-cf582605a8ad"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.277308 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "09781861-e068-40bb-9190-cf582605a8ad" (UID: "09781861-e068-40bb-9190-cf582605a8ad"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.277485 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-scripts" (OuterVolumeSpecName: "scripts") pod "09781861-e068-40bb-9190-cf582605a8ad" (UID: "09781861-e068-40bb-9190-cf582605a8ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.295718 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09781861-e068-40bb-9190-cf582605a8ad-kube-api-access-2hmkh" (OuterVolumeSpecName: "kube-api-access-2hmkh") pod "09781861-e068-40bb-9190-cf582605a8ad" (UID: "09781861-e068-40bb-9190-cf582605a8ad"). InnerVolumeSpecName "kube-api-access-2hmkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.378712 4672 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.378738 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09781861-e068-40bb-9190-cf582605a8ad-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.378749 4672 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-run\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.378757 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hmkh\" (UniqueName: \"kubernetes.io/projected/09781861-e068-40bb-9190-cf582605a8ad-kube-api-access-2hmkh\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.378773 4672 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/09781861-e068-40bb-9190-cf582605a8ad-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.413610 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hxgmq-config-g9fpp"]
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.421495 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hxgmq-config-g9fpp"]
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.565473 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09781861-e068-40bb-9190-cf582605a8ad" path="/var/lib/kubelet/pods/09781861-e068-40bb-9190-cf582605a8ad/volumes"
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.885023 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hxgmq-config-g9fpp"
Dec 06 09:23:02 crc kubenswrapper[4672]: I1206 09:23:02.885034 4672 scope.go:117] "RemoveContainer" containerID="12d31ea3e9e8a632f558ec03f94c5d53a20da96056cf20b983e3317e458b5510"
Dec 06 09:23:03 crc kubenswrapper[4672]: I1206 09:23:03.237290 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-js95z"
Dec 06 09:23:03 crc kubenswrapper[4672]: I1206 09:23:03.237492 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-js95z"
Dec 06 09:23:04 crc kubenswrapper[4672]: I1206 09:23:04.293928 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-js95z" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="registry-server" probeResult="failure" output=<
Dec 06 09:23:04 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s
Dec 06 09:23:04 crc kubenswrapper[4672]: >
"1d834bae-e9bc-4c2e-be87-2749743f8ef0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.389960 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d834bae-e9bc-4c2e-be87-2749743f8ef0-kube-api-access-h2mwg" (OuterVolumeSpecName: "kube-api-access-h2mwg") pod "1d834bae-e9bc-4c2e-be87-2749743f8ef0" (UID: "1d834bae-e9bc-4c2e-be87-2749743f8ef0"). InnerVolumeSpecName "kube-api-access-h2mwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.405571 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d834bae-e9bc-4c2e-be87-2749743f8ef0" (UID: "1d834bae-e9bc-4c2e-be87-2749743f8ef0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.421035 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-config-data" (OuterVolumeSpecName: "config-data") pod "1d834bae-e9bc-4c2e-be87-2749743f8ef0" (UID: "1d834bae-e9bc-4c2e-be87-2749743f8ef0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.485683 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.485714 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.485725 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2mwg\" (UniqueName: \"kubernetes.io/projected/1d834bae-e9bc-4c2e-be87-2749743f8ef0-kube-api-access-h2mwg\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.485736 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d834bae-e9bc-4c2e-be87-2749743f8ef0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.953158 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tlfw9" event={"ID":"1d834bae-e9bc-4c2e-be87-2749743f8ef0","Type":"ContainerDied","Data":"e00a98a936db6d3617bb693fb205c39aba6487a6e07a73bb1f6f0edd4d119996"} Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.953423 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00a98a936db6d3617bb693fb205c39aba6487a6e07a73bb1f6f0edd4d119996" Dec 06 09:23:09 crc kubenswrapper[4672]: I1206 09:23:09.953255 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tlfw9" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.428191 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75cb88fd77-prz47"] Dec 06 09:23:10 crc kubenswrapper[4672]: E1206 09:23:10.429029 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d834bae-e9bc-4c2e-be87-2749743f8ef0" containerName="glance-db-sync" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.429047 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d834bae-e9bc-4c2e-be87-2749743f8ef0" containerName="glance-db-sync" Dec 06 09:23:10 crc kubenswrapper[4672]: E1206 09:23:10.429069 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="extract-content" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.429079 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="extract-content" Dec 06 09:23:10 crc kubenswrapper[4672]: E1206 09:23:10.429103 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09781861-e068-40bb-9190-cf582605a8ad" containerName="ovn-config" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.429113 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="09781861-e068-40bb-9190-cf582605a8ad" containerName="ovn-config" Dec 06 09:23:10 crc kubenswrapper[4672]: E1206 09:23:10.429131 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="extract-utilities" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.436537 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="extract-utilities" Dec 06 09:23:10 crc kubenswrapper[4672]: E1206 09:23:10.436626 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="registry-server" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.436636 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="registry-server" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.437136 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="09781861-e068-40bb-9190-cf582605a8ad" containerName="ovn-config" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.437157 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e3e6e1-ca0e-405e-a482-539d4dc9cadd" containerName="registry-server" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.437208 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d834bae-e9bc-4c2e-be87-2749743f8ef0" containerName="glance-db-sync" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.438321 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.444301 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75cb88fd77-prz47"] Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.510392 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-dns-svc\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.510446 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-config\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.510469 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pskrq\" (UniqueName: \"kubernetes.io/projected/dd82121b-106e-4488-b0e7-5f1c081077d4-kube-api-access-pskrq\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.510541 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-sb\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.510562 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-nb\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.612325 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-nb\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.612413 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-dns-svc\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.612453 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-config\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.612472 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pskrq\" (UniqueName: \"kubernetes.io/projected/dd82121b-106e-4488-b0e7-5f1c081077d4-kube-api-access-pskrq\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.612541 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-sb\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.613384 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-dns-svc\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.613393 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-nb\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.613550 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-sb\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.613670 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-config\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.634857 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pskrq\" (UniqueName: \"kubernetes.io/projected/dd82121b-106e-4488-b0e7-5f1c081077d4-kube-api-access-pskrq\") pod \"dnsmasq-dns-75cb88fd77-prz47\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:10 crc kubenswrapper[4672]: I1206 09:23:10.759780 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:11 crc kubenswrapper[4672]: I1206 09:23:11.004157 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75cb88fd77-prz47"] Dec 06 09:23:11 crc kubenswrapper[4672]: I1206 09:23:11.972963 4672 generic.go:334] "Generic (PLEG): container finished" podID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerID="97e9da8f79d9127fbd4be22a3490dde3eb27d9700610f1d31dfc4437b7ac187a" exitCode=0 Dec 06 09:23:11 crc kubenswrapper[4672]: I1206 09:23:11.973239 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" event={"ID":"dd82121b-106e-4488-b0e7-5f1c081077d4","Type":"ContainerDied","Data":"97e9da8f79d9127fbd4be22a3490dde3eb27d9700610f1d31dfc4437b7ac187a"} Dec 06 09:23:11 crc kubenswrapper[4672]: I1206 09:23:11.974939 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" event={"ID":"dd82121b-106e-4488-b0e7-5f1c081077d4","Type":"ContainerStarted","Data":"c589be46e5af1650a4ff32af46254dc38666a3b1fdb8fd3f3536b0157a579ba0"} Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.319722 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.319781 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.319825 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.320461 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6dc0e941a4dd3e79f056ce0d1f08eb3aa888fb31efcafdbd3ecc3f28c01b9f06"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.320505 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://6dc0e941a4dd3e79f056ce0d1f08eb3aa888fb31efcafdbd3ecc3f28c01b9f06" gracePeriod=600 Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.994100 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="6dc0e941a4dd3e79f056ce0d1f08eb3aa888fb31efcafdbd3ecc3f28c01b9f06" exitCode=0 Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.994168 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"6dc0e941a4dd3e79f056ce0d1f08eb3aa888fb31efcafdbd3ecc3f28c01b9f06"} Dec 06 09:23:12 crc kubenswrapper[4672]: I1206 09:23:12.994386 
4672 scope.go:117] "RemoveContainer" containerID="157a1103c9931308d56d2a9afffb01b9138166ad6f81a369e330a682cba427f9" Dec 06 09:23:13 crc kubenswrapper[4672]: I1206 09:23:13.283357 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-js95z" Dec 06 09:23:13 crc kubenswrapper[4672]: I1206 09:23:13.341366 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-js95z" Dec 06 09:23:13 crc kubenswrapper[4672]: I1206 09:23:13.519922 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-js95z"] Dec 06 09:23:14 crc kubenswrapper[4672]: I1206 09:23:14.004043 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"a101a6d3a9ea73e6619b9412aec8733c3ef377249e41f4c656d15ff2d987965d"} Dec 06 09:23:14 crc kubenswrapper[4672]: I1206 09:23:14.008161 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" event={"ID":"dd82121b-106e-4488-b0e7-5f1c081077d4","Type":"ContainerStarted","Data":"aa2e917ca3534c104ac0063fc2fe89051234da2a787ec4135f6f2e51e349eeea"} Dec 06 09:23:14 crc kubenswrapper[4672]: I1206 09:23:14.037298 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" podStartSLOduration=4.037281285 podStartE2EDuration="4.037281285s" podCreationTimestamp="2025-12-06 09:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:14.033453411 +0000 UTC m=+1011.777713698" watchObservedRunningTime="2025-12-06 09:23:14.037281285 +0000 UTC m=+1011.781541562" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.019087 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-js95z" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="registry-server" containerID="cri-o://0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7" gracePeriod=2 Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.021253 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.543727 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-js95z" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.691504 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-catalog-content\") pod \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.691578 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-utilities\") pod \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.691680 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr2fk\" (UniqueName: \"kubernetes.io/projected/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-kube-api-access-hr2fk\") pod \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\" (UID: \"5aed11bb-ab80-496c-b92a-3fdc6a9fc044\") " Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.692877 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-utilities" (OuterVolumeSpecName: "utilities") pod "5aed11bb-ab80-496c-b92a-3fdc6a9fc044" (UID: "5aed11bb-ab80-496c-b92a-3fdc6a9fc044"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.693418 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.698494 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-kube-api-access-hr2fk" (OuterVolumeSpecName: "kube-api-access-hr2fk") pod "5aed11bb-ab80-496c-b92a-3fdc6a9fc044" (UID: "5aed11bb-ab80-496c-b92a-3fdc6a9fc044"). InnerVolumeSpecName "kube-api-access-hr2fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.714203 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.760785 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.773083 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5aed11bb-ab80-496c-b92a-3fdc6a9fc044" (UID: "5aed11bb-ab80-496c-b92a-3fdc6a9fc044"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.796337 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:15 crc kubenswrapper[4672]: I1206 09:23:15.796413 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr2fk\" (UniqueName: \"kubernetes.io/projected/5aed11bb-ab80-496c-b92a-3fdc6a9fc044-kube-api-access-hr2fk\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.027622 4672 generic.go:334] "Generic (PLEG): container finished" podID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerID="0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7" exitCode=0 Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.027686 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-js95z" event={"ID":"5aed11bb-ab80-496c-b92a-3fdc6a9fc044","Type":"ContainerDied","Data":"0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7"} Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.027748 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-js95z" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.027778 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-js95z" event={"ID":"5aed11bb-ab80-496c-b92a-3fdc6a9fc044","Type":"ContainerDied","Data":"e78d25dc3fa28360c8731b92cf623da1f1e6287a5246616f6729e5fa42a809e3"} Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.027798 4672 scope.go:117] "RemoveContainer" containerID="0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.059832 4672 scope.go:117] "RemoveContainer" containerID="1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.119876 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-js95z"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.140675 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-js95z"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.146813 4672 scope.go:117] "RemoveContainer" containerID="c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.158956 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-9mbg7"] Dec 06 09:23:16 crc kubenswrapper[4672]: E1206 09:23:16.159267 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="registry-server" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.159287 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="registry-server" Dec 06 09:23:16 crc kubenswrapper[4672]: E1206 09:23:16.159298 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="extract-utilities" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.159305 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="extract-utilities" Dec 06 
09:23:16 crc kubenswrapper[4672]: E1206 09:23:16.159351 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="extract-content" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.159361 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="extract-content" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.159506 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" containerName="registry-server" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.163781 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.185439 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9mbg7"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.185528 4672 scope.go:117] "RemoveContainer" containerID="0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7" Dec 06 09:23:16 crc kubenswrapper[4672]: E1206 09:23:16.186039 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7\": container with ID starting with 0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7 not found: ID does not exist" containerID="0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.186096 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7"} err="failed to get container status \"0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7\": rpc error: code = NotFound desc = could not find container \"0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7\": container with ID starting with 0ffb5393da1cf97ec786feb9117d39c56c32db2ea7e5f5b8fe8ee6468d2b7dd7 not found: ID does not exist" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.186123 4672 scope.go:117] "RemoveContainer" containerID="1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a" Dec 06 09:23:16 crc kubenswrapper[4672]: E1206 09:23:16.189961 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a\": container with ID starting with 1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a not found: ID does not exist" containerID="1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.190000 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a"} err="failed to get container status \"1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a\": rpc error: code = NotFound desc = could not find container \"1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a\": container with ID starting with 1a78a95a57df082ebb045637c5030fe26705fc21f930f2e02f3f140651e95b5a not found: ID does not exist" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.190021 4672 scope.go:117] "RemoveContainer" 
containerID="c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d" Dec 06 09:23:16 crc kubenswrapper[4672]: E1206 09:23:16.190330 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d\": container with ID starting with c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d not found: ID does not exist" containerID="c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.190350 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d"} err="failed to get container status \"c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d\": rpc error: code = NotFound desc = could not find container \"c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d\": container with ID starting with c54289caead21061b5a5c1d9cb6799fd2624702516c75103c82ff99abb02563d not found: ID does not exist" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.209074 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498d72d7-0324-4ab2-9208-0b32e1df2efa-operator-scripts\") pod \"cinder-db-create-9mbg7\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") " pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.209289 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcncv\" (UniqueName: \"kubernetes.io/projected/498d72d7-0324-4ab2-9208-0b32e1df2efa-kube-api-access-fcncv\") pod \"cinder-db-create-9mbg7\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") " pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.293356 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c4c2-account-create-update-6lfzq"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.294393 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.301054 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.305822 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cbj5x"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.307992 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.310868 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcncv\" (UniqueName: \"kubernetes.io/projected/498d72d7-0324-4ab2-9208-0b32e1df2efa-kube-api-access-fcncv\") pod \"cinder-db-create-9mbg7\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") " pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.310995 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498d72d7-0324-4ab2-9208-0b32e1df2efa-operator-scripts\") pod \"cinder-db-create-9mbg7\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") " pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.311550 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498d72d7-0324-4ab2-9208-0b32e1df2efa-operator-scripts\") pod \"cinder-db-create-9mbg7\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") " pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.338053 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cbj5x"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.362450 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c4c2-account-create-update-6lfzq"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.400540 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcncv\" (UniqueName: \"kubernetes.io/projected/498d72d7-0324-4ab2-9208-0b32e1df2efa-kube-api-access-fcncv\") pod \"cinder-db-create-9mbg7\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") " pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.412710 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbprm\" (UniqueName: \"kubernetes.io/projected/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-kube-api-access-hbprm\") pod \"cinder-c4c2-account-create-update-6lfzq\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") " pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.412760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmdzz\" (UniqueName: \"kubernetes.io/projected/165da568-dddb-472f-860a-0f36faa35334-kube-api-access-bmdzz\") pod \"barbican-db-create-cbj5x\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") " pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.412813 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/165da568-dddb-472f-860a-0f36faa35334-operator-scripts\") pod \"barbican-db-create-cbj5x\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") " pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.412925 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-operator-scripts\") pod \"cinder-c4c2-account-create-update-6lfzq\" (UID: 
\"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") " pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.452722 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0de7-account-create-update-zxjmn"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.454013 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.457075 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.457563 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0de7-account-create-update-zxjmn"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.514069 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jq2gm"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.515321 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.516503 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-operator-scripts\") pod \"cinder-c4c2-account-create-update-6lfzq\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") " pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.517526 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-operator-scripts\") pod \"cinder-c4c2-account-create-update-6lfzq\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") " pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.518764 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9mbg7" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.519082 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbprm\" (UniqueName: \"kubernetes.io/projected/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-kube-api-access-hbprm\") pod \"cinder-c4c2-account-create-update-6lfzq\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") " pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.522929 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523153 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523268 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vq6n8" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523390 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523567 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmdzz\" (UniqueName: \"kubernetes.io/projected/165da568-dddb-472f-860a-0f36faa35334-kube-api-access-bmdzz\") pod \"barbican-db-create-cbj5x\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") " pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523677 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/165da568-dddb-472f-860a-0f36faa35334-operator-scripts\") pod \"barbican-db-create-cbj5x\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") " pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523750 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgx6w\" (UniqueName: \"kubernetes.io/projected/31cf51cc-3132-467e-8e83-a8633775caa9-kube-api-access-wgx6w\") pod \"barbican-0de7-account-create-update-zxjmn\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") " pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.523878 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31cf51cc-3132-467e-8e83-a8633775caa9-operator-scripts\") pod \"barbican-0de7-account-create-update-zxjmn\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") " pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.524747 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/165da568-dddb-472f-860a-0f36faa35334-operator-scripts\") pod \"barbican-db-create-cbj5x\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") " pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.533396 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jq2gm"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.573271 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmdzz\" (UniqueName: 
\"kubernetes.io/projected/165da568-dddb-472f-860a-0f36faa35334-kube-api-access-bmdzz\") pod \"barbican-db-create-cbj5x\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") " pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.578079 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbprm\" (UniqueName: \"kubernetes.io/projected/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-kube-api-access-hbprm\") pod \"cinder-c4c2-account-create-update-6lfzq\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") " pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.601521 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aed11bb-ab80-496c-b92a-3fdc6a9fc044" path="/var/lib/kubelet/pods/5aed11bb-ab80-496c-b92a-3fdc6a9fc044/volumes" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.607055 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-pp5qd"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.607999 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.611939 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c4c2-account-create-update-6lfzq" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.628798 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-combined-ca-bundle\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.628851 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgx6w\" (UniqueName: \"kubernetes.io/projected/31cf51cc-3132-467e-8e83-a8633775caa9-kube-api-access-wgx6w\") pod \"barbican-0de7-account-create-update-zxjmn\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") " pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.628913 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqfk\" (UniqueName: \"kubernetes.io/projected/d9d00f72-7218-4bbf-bbfd-cd664d5be035-kube-api-access-2mqfk\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.628937 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31cf51cc-3132-467e-8e83-a8633775caa9-operator-scripts\") pod \"barbican-0de7-account-create-update-zxjmn\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") " pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.628986 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-config-data\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.633096 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31cf51cc-3132-467e-8e83-a8633775caa9-operator-scripts\") pod \"barbican-0de7-account-create-update-zxjmn\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") " pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.634382 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbj5x" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.640743 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-pp5qd"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.668981 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgx6w\" (UniqueName: \"kubernetes.io/projected/31cf51cc-3132-467e-8e83-a8633775caa9-kube-api-access-wgx6w\") pod \"barbican-0de7-account-create-update-zxjmn\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") " pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.709041 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54df-account-create-update-cq6z9"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.713857 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.719997 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.726253 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54df-account-create-update-cq6z9"] Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.731041 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-combined-ca-bundle\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.731086 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/babff9b3-3ae2-49df-9b15-c4c7110c21f6-operator-scripts\") pod \"neutron-db-create-pp5qd\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") " pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.731136 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqfk\" (UniqueName: \"kubernetes.io/projected/d9d00f72-7218-4bbf-bbfd-cd664d5be035-kube-api-access-2mqfk\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.731167 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2b2s\" (UniqueName: \"kubernetes.io/projected/babff9b3-3ae2-49df-9b15-c4c7110c21f6-kube-api-access-t2b2s\") pod \"neutron-db-create-pp5qd\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") " pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.731207 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-config-data\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.734350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-config-data\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.735980 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-combined-ca-bundle\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.768854 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqfk\" (UniqueName: \"kubernetes.io/projected/d9d00f72-7218-4bbf-bbfd-cd664d5be035-kube-api-access-2mqfk\") pod \"keystone-db-sync-jq2gm\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") " pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.777038 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0de7-account-create-update-zxjmn" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.832731 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2b2s\" (UniqueName: \"kubernetes.io/projected/babff9b3-3ae2-49df-9b15-c4c7110c21f6-kube-api-access-t2b2s\") pod \"neutron-db-create-pp5qd\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") " pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.832819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaceb20-8947-487b-b126-28b9509598ef-operator-scripts\") pod \"neutron-54df-account-create-update-cq6z9\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") " pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.832860 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk44q\" (UniqueName: \"kubernetes.io/projected/4eaceb20-8947-487b-b126-28b9509598ef-kube-api-access-xk44q\") pod \"neutron-54df-account-create-update-cq6z9\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") " pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.832896 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/babff9b3-3ae2-49df-9b15-c4c7110c21f6-operator-scripts\") pod \"neutron-db-create-pp5qd\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") " pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.833493 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/babff9b3-3ae2-49df-9b15-c4c7110c21f6-operator-scripts\") pod \"neutron-db-create-pp5qd\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") " pod="openstack/neutron-db-create-pp5qd" Dec 06 
09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.843854 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jq2gm" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.854736 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2b2s\" (UniqueName: \"kubernetes.io/projected/babff9b3-3ae2-49df-9b15-c4c7110c21f6-kube-api-access-t2b2s\") pod \"neutron-db-create-pp5qd\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") " pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.935562 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaceb20-8947-487b-b126-28b9509598ef-operator-scripts\") pod \"neutron-54df-account-create-update-cq6z9\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") " pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.935635 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk44q\" (UniqueName: \"kubernetes.io/projected/4eaceb20-8947-487b-b126-28b9509598ef-kube-api-access-xk44q\") pod \"neutron-54df-account-create-update-cq6z9\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") " pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.936406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaceb20-8947-487b-b126-28b9509598ef-operator-scripts\") pod \"neutron-54df-account-create-update-cq6z9\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") " pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.952869 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk44q\" (UniqueName: \"kubernetes.io/projected/4eaceb20-8947-487b-b126-28b9509598ef-kube-api-access-xk44q\") pod \"neutron-54df-account-create-update-cq6z9\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") " pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:16 crc kubenswrapper[4672]: I1206 09:23:16.987830 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pp5qd" Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.037797 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54df-account-create-update-cq6z9" Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.219319 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cbj5x"] Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.231042 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9mbg7"] Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.386711 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c4c2-account-create-update-6lfzq"] Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.464913 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0de7-account-create-update-zxjmn"] Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.610136 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jq2gm"] Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.767984 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54df-account-create-update-cq6z9"] Dec 06 09:23:17 crc kubenswrapper[4672]: I1206 09:23:17.786909 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-pp5qd"] Dec 06 09:23:17 crc kubenswrapper[4672]: W1206 09:23:17.792812 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbabff9b3_3ae2_49df_9b15_c4c7110c21f6.slice/crio-1f6b099b8d39409480c26b9d558179854159fbc653221940270b6bd86e444492 WatchSource:0}: Error finding container 1f6b099b8d39409480c26b9d558179854159fbc653221940270b6bd86e444492: Status 404 returned error can't find the container with id 1f6b099b8d39409480c26b9d558179854159fbc653221940270b6bd86e444492 Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.099953 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0de7-account-create-update-zxjmn" event={"ID":"31cf51cc-3132-467e-8e83-a8633775caa9","Type":"ContainerStarted","Data":"d8b784aa4ff03fd54abb29abbb3979fe86931fe2c7fd1fbc2f6d783b470fe074"} Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.099994 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0de7-account-create-update-zxjmn" event={"ID":"31cf51cc-3132-467e-8e83-a8633775caa9","Type":"ContainerStarted","Data":"e341de9190cd6ebb07193a19e12998e4186ee206385d11a830712ac8f0dc2656"} Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.119434 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pp5qd" event={"ID":"babff9b3-3ae2-49df-9b15-c4c7110c21f6","Type":"ContainerStarted","Data":"1f6b099b8d39409480c26b9d558179854159fbc653221940270b6bd86e444492"} Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.127536 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54df-account-create-update-cq6z9" event={"ID":"4eaceb20-8947-487b-b126-28b9509598ef","Type":"ContainerStarted","Data":"fbfa4995d97197c8560fcb0f835b5ae70d88604e5b817ad3ea5e33d3ffd40679"} Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.140275 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9mbg7" event={"ID":"498d72d7-0324-4ab2-9208-0b32e1df2efa","Type":"ContainerStarted","Data":"805a53042030daa2c1cbac1e0e89a3b90c7e98a10747d1a136497b415ddccb40"} Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.140805 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9mbg7" 
event={"ID":"498d72d7-0324-4ab2-9208-0b32e1df2efa","Type":"ContainerStarted","Data":"ce86857446e6bc82347a3523f0a9090b0e92bb20eb13e197e4db4962980daa29"}
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.142762 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbj5x" event={"ID":"165da568-dddb-472f-860a-0f36faa35334","Type":"ContainerStarted","Data":"ee65c5fd5ed129c3f69bccfa941a91c09f6bcf684773da7bffea705b0be720e1"}
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.142812 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbj5x" event={"ID":"165da568-dddb-472f-860a-0f36faa35334","Type":"ContainerStarted","Data":"ffaad55eefc811e19b2d3be88134dec7ddf4b8339e6992514bc916e92b3883fa"}
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.143504 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0de7-account-create-update-zxjmn" podStartSLOduration=2.143488186 podStartE2EDuration="2.143488186s" podCreationTimestamp="2025-12-06 09:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:18.139459127 +0000 UTC m=+1015.883719414" watchObservedRunningTime="2025-12-06 09:23:18.143488186 +0000 UTC m=+1015.887748473"
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.153353 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jq2gm" event={"ID":"d9d00f72-7218-4bbf-bbfd-cd664d5be035","Type":"ContainerStarted","Data":"a96d7778b124b1f9f3deca7f27b49526bb3c7b1a998d6e960c7b19992404879a"}
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.173092 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c4c2-account-create-update-6lfzq" event={"ID":"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02","Type":"ContainerStarted","Data":"fddd92464dd7a0e099dab5aa9f85a16b03b77d4a2cdbcee0fb70b97d2606c96e"}
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.173142 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c4c2-account-create-update-6lfzq" event={"ID":"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02","Type":"ContainerStarted","Data":"0304f6e99db79b998c09f4e9442ead4d0338b7af816cd7e2fbe528deea54079b"}
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.202050 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-9mbg7" podStartSLOduration=2.202031126 podStartE2EDuration="2.202031126s" podCreationTimestamp="2025-12-06 09:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:18.19410739 +0000 UTC m=+1015.938367677" watchObservedRunningTime="2025-12-06 09:23:18.202031126 +0000 UTC m=+1015.946291413"
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.202739 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54df-account-create-update-cq6z9" podStartSLOduration=2.202732816 podStartE2EDuration="2.202732816s" podCreationTimestamp="2025-12-06 09:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:18.176461191 +0000 UTC m=+1015.920721468" watchObservedRunningTime="2025-12-06 09:23:18.202732816 +0000 UTC m=+1015.946993103"
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.213316 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c4c2-account-create-update-6lfzq" podStartSLOduration=2.213299842 podStartE2EDuration="2.213299842s" podCreationTimestamp="2025-12-06 09:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:18.21249039 +0000 UTC m=+1015.956750677" watchObservedRunningTime="2025-12-06 09:23:18.213299842 +0000 UTC m=+1015.957560129"
Dec 06 09:23:18 crc kubenswrapper[4672]: I1206 09:23:18.251720 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-cbj5x" podStartSLOduration=2.251694225 podStartE2EDuration="2.251694225s" podCreationTimestamp="2025-12-06 09:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:18.246027991 +0000 UTC m=+1015.990288278" watchObservedRunningTime="2025-12-06 09:23:18.251694225 +0000 UTC m=+1015.995954512"
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.256939 4672 generic.go:334] "Generic (PLEG): container finished" podID="4eaceb20-8947-487b-b126-28b9509598ef" containerID="38cfbdc5b578be4bc8957b97d561849aece809e2f3d4ade4cf597f1f7de51e61" exitCode=0
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.257028 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54df-account-create-update-cq6z9" event={"ID":"4eaceb20-8947-487b-b126-28b9509598ef","Type":"ContainerDied","Data":"38cfbdc5b578be4bc8957b97d561849aece809e2f3d4ade4cf597f1f7de51e61"}
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.259008 4672 generic.go:334] "Generic (PLEG): container finished" podID="498d72d7-0324-4ab2-9208-0b32e1df2efa" containerID="805a53042030daa2c1cbac1e0e89a3b90c7e98a10747d1a136497b415ddccb40" exitCode=0
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.259103 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9mbg7" event={"ID":"498d72d7-0324-4ab2-9208-0b32e1df2efa","Type":"ContainerDied","Data":"805a53042030daa2c1cbac1e0e89a3b90c7e98a10747d1a136497b415ddccb40"}
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.278032 4672 generic.go:334] "Generic (PLEG): container finished" podID="165da568-dddb-472f-860a-0f36faa35334" containerID="ee65c5fd5ed129c3f69bccfa941a91c09f6bcf684773da7bffea705b0be720e1" exitCode=0
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.278148 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbj5x" event={"ID":"165da568-dddb-472f-860a-0f36faa35334","Type":"ContainerDied","Data":"ee65c5fd5ed129c3f69bccfa941a91c09f6bcf684773da7bffea705b0be720e1"}
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.285647 4672 generic.go:334] "Generic (PLEG): container finished" podID="ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" containerID="fddd92464dd7a0e099dab5aa9f85a16b03b77d4a2cdbcee0fb70b97d2606c96e" exitCode=0
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.285745 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c4c2-account-create-update-6lfzq" event={"ID":"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02","Type":"ContainerDied","Data":"fddd92464dd7a0e099dab5aa9f85a16b03b77d4a2cdbcee0fb70b97d2606c96e"}
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.287960 4672 generic.go:334] "Generic (PLEG): container finished" podID="31cf51cc-3132-467e-8e83-a8633775caa9" containerID="d8b784aa4ff03fd54abb29abbb3979fe86931fe2c7fd1fbc2f6d783b470fe074" exitCode=0
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.288095 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0de7-account-create-update-zxjmn" event={"ID":"31cf51cc-3132-467e-8e83-a8633775caa9","Type":"ContainerDied","Data":"d8b784aa4ff03fd54abb29abbb3979fe86931fe2c7fd1fbc2f6d783b470fe074"}
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.290586 4672 generic.go:334] "Generic (PLEG): container finished" podID="babff9b3-3ae2-49df-9b15-c4c7110c21f6" containerID="09408829ab4f8d61406eb319b486ffe5219bbaca59cc509747aa7eaa5cd79ca3" exitCode=0
Dec 06 09:23:19 crc kubenswrapper[4672]: I1206 09:23:19.290663 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pp5qd" event={"ID":"babff9b3-3ae2-49df-9b15-c4c7110c21f6","Type":"ContainerDied","Data":"09408829ab4f8d61406eb319b486ffe5219bbaca59cc509747aa7eaa5cd79ca3"}
Dec 06 09:23:20 crc kubenswrapper[4672]: I1206 09:23:20.762808 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75cb88fd77-prz47"
Dec 06 09:23:20 crc kubenswrapper[4672]: I1206 09:23:20.822214 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-mc6xv"]
Dec 06 09:23:20 crc kubenswrapper[4672]: I1206 09:23:20.828004 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="dnsmasq-dns" containerID="cri-o://069624d6f78c879722c48df2461e59ab428ff9cd50de3dd13265af4422a0f040" gracePeriod=10
Dec 06 09:23:21 crc kubenswrapper[4672]: I1206 09:23:21.318988 4672 generic.go:334] "Generic (PLEG): container finished" podID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerID="069624d6f78c879722c48df2461e59ab428ff9cd50de3dd13265af4422a0f040" exitCode=0
Dec 06 09:23:21 crc kubenswrapper[4672]: I1206 09:23:21.319059 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" event={"ID":"00fa21e3-1eba-4ac6-9eb7-330297e229fb","Type":"ContainerDied","Data":"069624d6f78c879722c48df2461e59ab428ff9cd50de3dd13265af4422a0f040"}
Dec 06 09:23:22 crc kubenswrapper[4672]: I1206 09:23:22.385246 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.343030 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0de7-account-create-update-zxjmn" event={"ID":"31cf51cc-3132-467e-8e83-a8633775caa9","Type":"ContainerDied","Data":"e341de9190cd6ebb07193a19e12998e4186ee206385d11a830712ac8f0dc2656"}
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.343337 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e341de9190cd6ebb07193a19e12998e4186ee206385d11a830712ac8f0dc2656"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.344623 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pp5qd" event={"ID":"babff9b3-3ae2-49df-9b15-c4c7110c21f6","Type":"ContainerDied","Data":"1f6b099b8d39409480c26b9d558179854159fbc653221940270b6bd86e444492"}
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.344643 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6b099b8d39409480c26b9d558179854159fbc653221940270b6bd86e444492"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.348132 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54df-account-create-update-cq6z9" event={"ID":"4eaceb20-8947-487b-b126-28b9509598ef","Type":"ContainerDied","Data":"fbfa4995d97197c8560fcb0f835b5ae70d88604e5b817ad3ea5e33d3ffd40679"}
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.348160 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbfa4995d97197c8560fcb0f835b5ae70d88604e5b817ad3ea5e33d3ffd40679"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.351065 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9mbg7" event={"ID":"498d72d7-0324-4ab2-9208-0b32e1df2efa","Type":"ContainerDied","Data":"ce86857446e6bc82347a3523f0a9090b0e92bb20eb13e197e4db4962980daa29"}
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.351091 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce86857446e6bc82347a3523f0a9090b0e92bb20eb13e197e4db4962980daa29"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.352803 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbj5x" event={"ID":"165da568-dddb-472f-860a-0f36faa35334","Type":"ContainerDied","Data":"ffaad55eefc811e19b2d3be88134dec7ddf4b8339e6992514bc916e92b3883fa"}
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.352831 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffaad55eefc811e19b2d3be88134dec7ddf4b8339e6992514bc916e92b3883fa"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.354762 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c4c2-account-create-update-6lfzq" event={"ID":"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02","Type":"ContainerDied","Data":"0304f6e99db79b998c09f4e9442ead4d0338b7af816cd7e2fbe528deea54079b"}
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.354791 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0304f6e99db79b998c09f4e9442ead4d0338b7af816cd7e2fbe528deea54079b"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.393350 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54df-account-create-update-cq6z9"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.398747 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9mbg7"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.452915 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c4c2-account-create-update-6lfzq"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.454503 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pp5qd"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.460306 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbj5x"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.462555 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcncv\" (UniqueName: \"kubernetes.io/projected/498d72d7-0324-4ab2-9208-0b32e1df2efa-kube-api-access-fcncv\") pod \"498d72d7-0324-4ab2-9208-0b32e1df2efa\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.462669 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaceb20-8947-487b-b126-28b9509598ef-operator-scripts\") pod \"4eaceb20-8947-487b-b126-28b9509598ef\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.462693 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498d72d7-0324-4ab2-9208-0b32e1df2efa-operator-scripts\") pod \"498d72d7-0324-4ab2-9208-0b32e1df2efa\" (UID: \"498d72d7-0324-4ab2-9208-0b32e1df2efa\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.462785 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk44q\" (UniqueName: \"kubernetes.io/projected/4eaceb20-8947-487b-b126-28b9509598ef-kube-api-access-xk44q\") pod \"4eaceb20-8947-487b-b126-28b9509598ef\" (UID: \"4eaceb20-8947-487b-b126-28b9509598ef\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.463960 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eaceb20-8947-487b-b126-28b9509598ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4eaceb20-8947-487b-b126-28b9509598ef" (UID: "4eaceb20-8947-487b-b126-28b9509598ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.466001 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498d72d7-0324-4ab2-9208-0b32e1df2efa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "498d72d7-0324-4ab2-9208-0b32e1df2efa" (UID: "498d72d7-0324-4ab2-9208-0b32e1df2efa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.470866 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eaceb20-8947-487b-b126-28b9509598ef-kube-api-access-xk44q" (OuterVolumeSpecName: "kube-api-access-xk44q") pod "4eaceb20-8947-487b-b126-28b9509598ef" (UID: "4eaceb20-8947-487b-b126-28b9509598ef"). InnerVolumeSpecName "kube-api-access-xk44q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.472748 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498d72d7-0324-4ab2-9208-0b32e1df2efa-kube-api-access-fcncv" (OuterVolumeSpecName: "kube-api-access-fcncv") pod "498d72d7-0324-4ab2-9208-0b32e1df2efa" (UID: "498d72d7-0324-4ab2-9208-0b32e1df2efa"). InnerVolumeSpecName "kube-api-access-fcncv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.472983 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0de7-account-create-update-zxjmn"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.480031 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv"
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.563848 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2b2s\" (UniqueName: \"kubernetes.io/projected/babff9b3-3ae2-49df-9b15-c4c7110c21f6-kube-api-access-t2b2s\") pod \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.563889 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/babff9b3-3ae2-49df-9b15-c4c7110c21f6-operator-scripts\") pod \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\" (UID: \"babff9b3-3ae2-49df-9b15-c4c7110c21f6\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.563914 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-sb\") pod \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.563934 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-nb\") pod \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.563973 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/165da568-dddb-472f-860a-0f36faa35334-operator-scripts\") pod \"165da568-dddb-472f-860a-0f36faa35334\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.563998 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31cf51cc-3132-467e-8e83-a8633775caa9-operator-scripts\") pod \"31cf51cc-3132-467e-8e83-a8633775caa9\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564061 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-dns-svc\") pod \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564080 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mghp8\" (UniqueName: \"kubernetes.io/projected/00fa21e3-1eba-4ac6-9eb7-330297e229fb-kube-api-access-mghp8\") pod \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564100 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmdzz\" (UniqueName: \"kubernetes.io/projected/165da568-dddb-472f-860a-0f36faa35334-kube-api-access-bmdzz\") pod \"165da568-dddb-472f-860a-0f36faa35334\" (UID: \"165da568-dddb-472f-860a-0f36faa35334\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564239 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbprm\" (UniqueName: \"kubernetes.io/projected/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-kube-api-access-hbprm\") pod \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564292 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-config\") pod \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\" (UID: \"00fa21e3-1eba-4ac6-9eb7-330297e229fb\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564311 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgx6w\" (UniqueName: \"kubernetes.io/projected/31cf51cc-3132-467e-8e83-a8633775caa9-kube-api-access-wgx6w\") pod \"31cf51cc-3132-467e-8e83-a8633775caa9\" (UID: \"31cf51cc-3132-467e-8e83-a8633775caa9\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564346 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-operator-scripts\") pod \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\" (UID: \"ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02\") "
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564398 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babff9b3-3ae2-49df-9b15-c4c7110c21f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "babff9b3-3ae2-49df-9b15-c4c7110c21f6" (UID: "babff9b3-3ae2-49df-9b15-c4c7110c21f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564636 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4eaceb20-8947-487b-b126-28b9509598ef-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564654 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/498d72d7-0324-4ab2-9208-0b32e1df2efa-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564664 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/babff9b3-3ae2-49df-9b15-c4c7110c21f6-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564672 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk44q\" (UniqueName: \"kubernetes.io/projected/4eaceb20-8947-487b-b126-28b9509598ef-kube-api-access-xk44q\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564685 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcncv\" (UniqueName: \"kubernetes.io/projected/498d72d7-0324-4ab2-9208-0b32e1df2efa-kube-api-access-fcncv\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.564814 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165da568-dddb-472f-860a-0f36faa35334-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "165da568-dddb-472f-860a-0f36faa35334" (UID: "165da568-dddb-472f-860a-0f36faa35334"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.565062 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" (UID: "ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.565194 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31cf51cc-3132-467e-8e83-a8633775caa9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31cf51cc-3132-467e-8e83-a8633775caa9" (UID: "31cf51cc-3132-467e-8e83-a8633775caa9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.568538 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/babff9b3-3ae2-49df-9b15-c4c7110c21f6-kube-api-access-t2b2s" (OuterVolumeSpecName: "kube-api-access-t2b2s") pod "babff9b3-3ae2-49df-9b15-c4c7110c21f6" (UID: "babff9b3-3ae2-49df-9b15-c4c7110c21f6"). InnerVolumeSpecName "kube-api-access-t2b2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.568637 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00fa21e3-1eba-4ac6-9eb7-330297e229fb-kube-api-access-mghp8" (OuterVolumeSpecName: "kube-api-access-mghp8") pod "00fa21e3-1eba-4ac6-9eb7-330297e229fb" (UID: "00fa21e3-1eba-4ac6-9eb7-330297e229fb"). InnerVolumeSpecName "kube-api-access-mghp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.568753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-kube-api-access-hbprm" (OuterVolumeSpecName: "kube-api-access-hbprm") pod "ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" (UID: "ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02"). InnerVolumeSpecName "kube-api-access-hbprm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.580364 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165da568-dddb-472f-860a-0f36faa35334-kube-api-access-bmdzz" (OuterVolumeSpecName: "kube-api-access-bmdzz") pod "165da568-dddb-472f-860a-0f36faa35334" (UID: "165da568-dddb-472f-860a-0f36faa35334"). InnerVolumeSpecName "kube-api-access-bmdzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.582893 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31cf51cc-3132-467e-8e83-a8633775caa9-kube-api-access-wgx6w" (OuterVolumeSpecName: "kube-api-access-wgx6w") pod "31cf51cc-3132-467e-8e83-a8633775caa9" (UID: "31cf51cc-3132-467e-8e83-a8633775caa9"). InnerVolumeSpecName "kube-api-access-wgx6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.615319 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-config" (OuterVolumeSpecName: "config") pod "00fa21e3-1eba-4ac6-9eb7-330297e229fb" (UID: "00fa21e3-1eba-4ac6-9eb7-330297e229fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.615373 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00fa21e3-1eba-4ac6-9eb7-330297e229fb" (UID: "00fa21e3-1eba-4ac6-9eb7-330297e229fb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.624115 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00fa21e3-1eba-4ac6-9eb7-330297e229fb" (UID: "00fa21e3-1eba-4ac6-9eb7-330297e229fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.625127 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00fa21e3-1eba-4ac6-9eb7-330297e229fb" (UID: "00fa21e3-1eba-4ac6-9eb7-330297e229fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666646 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbprm\" (UniqueName: \"kubernetes.io/projected/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-kube-api-access-hbprm\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666703 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-config\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666717 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgx6w\" (UniqueName: \"kubernetes.io/projected/31cf51cc-3132-467e-8e83-a8633775caa9-kube-api-access-wgx6w\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666731 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666743 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2b2s\" (UniqueName: \"kubernetes.io/projected/babff9b3-3ae2-49df-9b15-c4c7110c21f6-kube-api-access-t2b2s\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666754 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666769 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666780 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/165da568-dddb-472f-860a-0f36faa35334-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666796 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31cf51cc-3132-467e-8e83-a8633775caa9-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666807 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00fa21e3-1eba-4ac6-9eb7-330297e229fb-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666819 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mghp8\" (UniqueName: \"kubernetes.io/projected/00fa21e3-1eba-4ac6-9eb7-330297e229fb-kube-api-access-mghp8\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:23 crc kubenswrapper[4672]: I1206 09:23:23.666830 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmdzz\" (UniqueName: \"kubernetes.io/projected/165da568-dddb-472f-860a-0f36faa35334-kube-api-access-bmdzz\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.363825 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jq2gm" event={"ID":"d9d00f72-7218-4bbf-bbfd-cd664d5be035","Type":"ContainerStarted","Data":"1499aa2a253aa13e19e1279daa60b9266aaa4840c9640a1dd613305753d51684"}
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365779 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c4c2-account-create-update-6lfzq"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365813 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbj5x"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365831 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54df-account-create-update-cq6z9"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365839 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv" event={"ID":"00fa21e3-1eba-4ac6-9eb7-330297e229fb","Type":"ContainerDied","Data":"867943f01e7f1ec4d8475ac4ebb9b900728b58b74d7fdacd72af0d917ee6f4c6"}
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365872 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9mbg7"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365875 4672 scope.go:117] "RemoveContainer" containerID="069624d6f78c879722c48df2461e59ab428ff9cd50de3dd13265af4422a0f040"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365914 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pp5qd"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365779 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-mc6xv"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.365955 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0de7-account-create-update-zxjmn"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.403823 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jq2gm" podStartSLOduration=2.83466739 podStartE2EDuration="8.403796785s" podCreationTimestamp="2025-12-06 09:23:16 +0000 UTC" firstStartedPulling="2025-12-06 09:23:17.630588215 +0000 UTC m=+1015.374848502" lastFinishedPulling="2025-12-06 09:23:23.19971761 +0000 UTC m=+1020.943977897" observedRunningTime="2025-12-06 09:23:24.380210495 +0000 UTC m=+1022.124470782" watchObservedRunningTime="2025-12-06 09:23:24.403796785 +0000 UTC m=+1022.148057082"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.415529 4672 scope.go:117] "RemoveContainer" containerID="90ebba68239c1a06270308e81ae0d15fa97058c9cff698e26c6c9dd75e2904a5"
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.565766 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-mc6xv"]
Dec 06 09:23:24 crc kubenswrapper[4672]: I1206 09:23:24.567208 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-mc6xv"]
Dec 06 09:23:26 crc kubenswrapper[4672]: I1206 09:23:26.384892 4672 generic.go:334] "Generic (PLEG): container finished" podID="d9d00f72-7218-4bbf-bbfd-cd664d5be035" containerID="1499aa2a253aa13e19e1279daa60b9266aaa4840c9640a1dd613305753d51684" exitCode=0
Dec 06 09:23:26 crc kubenswrapper[4672]: I1206 09:23:26.385277 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jq2gm" event={"ID":"d9d00f72-7218-4bbf-bbfd-cd664d5be035","Type":"ContainerDied","Data":"1499aa2a253aa13e19e1279daa60b9266aaa4840c9640a1dd613305753d51684"}
Dec 06 09:23:26 crc kubenswrapper[4672]: I1206 09:23:26.569235 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" path="/var/lib/kubelet/pods/00fa21e3-1eba-4ac6-9eb7-330297e229fb/volumes"
Dec 06 09:23:27 crc kubenswrapper[4672]: I1206 09:23:27.803874 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jq2gm"
Dec 06 09:23:27 crc kubenswrapper[4672]: I1206 09:23:27.971356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-combined-ca-bundle\") pod \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") "
Dec 06 09:23:27 crc kubenswrapper[4672]: I1206 09:23:27.971500 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-config-data\") pod \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") "
Dec 06 09:23:27 crc kubenswrapper[4672]: I1206 09:23:27.971570 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mqfk\" (UniqueName: \"kubernetes.io/projected/d9d00f72-7218-4bbf-bbfd-cd664d5be035-kube-api-access-2mqfk\") pod \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\" (UID: \"d9d00f72-7218-4bbf-bbfd-cd664d5be035\") "
Dec 06 09:23:27 crc kubenswrapper[4672]: I1206 09:23:27.982272 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d00f72-7218-4bbf-bbfd-cd664d5be035-kube-api-access-2mqfk" (OuterVolumeSpecName: "kube-api-access-2mqfk") pod "d9d00f72-7218-4bbf-bbfd-cd664d5be035" (UID: "d9d00f72-7218-4bbf-bbfd-cd664d5be035"). InnerVolumeSpecName "kube-api-access-2mqfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.019096 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9d00f72-7218-4bbf-bbfd-cd664d5be035" (UID: "d9d00f72-7218-4bbf-bbfd-cd664d5be035"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.023450 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-config-data" (OuterVolumeSpecName: "config-data") pod "d9d00f72-7218-4bbf-bbfd-cd664d5be035" (UID: "d9d00f72-7218-4bbf-bbfd-cd664d5be035"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.073615 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.073676 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mqfk\" (UniqueName: \"kubernetes.io/projected/d9d00f72-7218-4bbf-bbfd-cd664d5be035-kube-api-access-2mqfk\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.073689 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d00f72-7218-4bbf-bbfd-cd664d5be035-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.413657 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jq2gm" event={"ID":"d9d00f72-7218-4bbf-bbfd-cd664d5be035","Type":"ContainerDied","Data":"a96d7778b124b1f9f3deca7f27b49526bb3c7b1a998d6e960c7b19992404879a"}
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.413698 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a96d7778b124b1f9f3deca7f27b49526bb3c7b1a998d6e960c7b19992404879a"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.413716 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jq2gm"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.688288 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qcdzb"]
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.688926 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498d72d7-0324-4ab2-9208-0b32e1df2efa" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.688952 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="498d72d7-0324-4ab2-9208-0b32e1df2efa" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.688969 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babff9b3-3ae2-49df-9b15-c4c7110c21f6" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.688977 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="babff9b3-3ae2-49df-9b15-c4c7110c21f6" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.688986 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d00f72-7218-4bbf-bbfd-cd664d5be035" containerName="keystone-db-sync"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.688995 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d00f72-7218-4bbf-bbfd-cd664d5be035" containerName="keystone-db-sync"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.689009 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31cf51cc-3132-467e-8e83-a8633775caa9" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689017 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="31cf51cc-3132-467e-8e83-a8633775caa9" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.689032 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="init"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689040 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="init"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.689058 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165da568-dddb-472f-860a-0f36faa35334" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689065 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="165da568-dddb-472f-860a-0f36faa35334" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.689078 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689087 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.689106 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eaceb20-8947-487b-b126-28b9509598ef" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689114 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eaceb20-8947-487b-b126-28b9509598ef" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: E1206 09:23:28.689133 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="dnsmasq-dns"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689140 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="dnsmasq-dns"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689338 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689361 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="165da568-dddb-472f-860a-0f36faa35334" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689377 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fa21e3-1eba-4ac6-9eb7-330297e229fb" containerName="dnsmasq-dns"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689387 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="31cf51cc-3132-467e-8e83-a8633775caa9" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689396 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="babff9b3-3ae2-49df-9b15-c4c7110c21f6" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689408 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eaceb20-8947-487b-b126-28b9509598ef" containerName="mariadb-account-create-update"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689421 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="498d72d7-0324-4ab2-9208-0b32e1df2efa" containerName="mariadb-database-create"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.689432 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d00f72-7218-4bbf-bbfd-cd664d5be035" containerName="keystone-db-sync"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.690063 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.693881 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.694027 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vq6n8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.694095 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.694257 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.697063 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.713035 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6db5d9649c-rxkr8"]
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.714548 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.723311 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qcdzb"]
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.774867 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db5d9649c-rxkr8"]
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.792906 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-scripts\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.792948 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-combined-ca-bundle\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.792973 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-fernet-keys\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.792996 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-sb\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793112 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-credential-keys\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793167 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-config-data\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793212 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-dns-svc\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793230 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b8mk\" (UniqueName: \"kubernetes.io/projected/a75d1d69-af3e-47a1-ac03-0d03b40c0999-kube-api-access-5b8mk\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793252 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-nb\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793309 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-config\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.793392 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6ln\" (UniqueName: \"kubernetes.io/projected/296e7802-9ad8-4dc7-8adb-cccf91467e55-kube-api-access-sh6ln\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.894889 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6ln\" (UniqueName: \"kubernetes.io/projected/296e7802-9ad8-4dc7-8adb-cccf91467e55-kube-api-access-sh6ln\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.894964 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-scripts\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.894986 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-combined-ca-bundle\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895149 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-fernet-keys\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895226 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-sb\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895293 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-credential-keys\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895342 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-config-data\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895395 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-dns-svc\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895423 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b8mk\" (UniqueName: \"kubernetes.io/projected/a75d1d69-af3e-47a1-ac03-0d03b40c0999-kube-api-access-5b8mk\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895452 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-nb\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.895526 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-config\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.896231 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-sb\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.896241 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-dns-svc\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.896797 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-config\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.897042 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-nb\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.901216 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-credential-keys\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.902467 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-fernet-keys\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.906173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-combined-ca-bundle\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.914666 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-config-data\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.915082 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b8mk\" (UniqueName: \"kubernetes.io/projected/a75d1d69-af3e-47a1-ac03-0d03b40c0999-kube-api-access-5b8mk\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.924909 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6ln\" (UniqueName: \"kubernetes.io/projected/296e7802-9ad8-4dc7-8adb-cccf91467e55-kube-api-access-sh6ln\") pod \"dnsmasq-dns-6db5d9649c-rxkr8\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:28 crc kubenswrapper[4672]: I1206 09:23:28.928216 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-scripts\") pod \"keystone-bootstrap-qcdzb\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.007856 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qcdzb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.028576 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jnkpx"]
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.037554 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.053934 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.054187 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lhcs8"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.064681 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.080463 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.094536 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jnkpx"]
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.098053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34617816-681d-44a7-b88d-73983735dd75-etc-machine-id\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.098151 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-db-sync-config-data\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.098205 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-scripts\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.098254 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-config-data\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.098290 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-combined-ca-bundle\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.098324 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9kdt\" (UniqueName: \"kubernetes.io/projected/34617816-681d-44a7-b88d-73983735dd75-kube-api-access-d9kdt\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.172038 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wz7rb"]
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.172954 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wz7rb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.175298 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.177220 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.177350 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kbwnm"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199439 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-db-sync-config-data\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199487 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-combined-ca-bundle\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199514 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-scripts\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199549 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-config-data\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199567 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-config-data\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199589 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78f7c5-5965-4932-845c-8f0be90b421a-logs\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199622 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-scripts\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199643 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-combined-ca-bundle\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwpmj\" (UniqueName: \"kubernetes.io/projected/4f78f7c5-5965-4932-845c-8f0be90b421a-kube-api-access-wwpmj\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199679 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9kdt\" (UniqueName: \"kubernetes.io/projected/34617816-681d-44a7-b88d-73983735dd75-kube-api-access-d9kdt\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199704 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34617816-681d-44a7-b88d-73983735dd75-etc-machine-id\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199767 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34617816-681d-44a7-b88d-73983735dd75-etc-machine-id\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.199783 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wz7rb"]
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.208325 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-scripts\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.213428 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-config-data\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.215080 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-combined-ca-bundle\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx"
Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.215415 4672 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-db-sync-config-data\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.253181 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9kdt\" (UniqueName: \"kubernetes.io/projected/34617816-681d-44a7-b88d-73983735dd75-kube-api-access-d9kdt\") pod \"cinder-db-sync-jnkpx\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " pod="openstack/cinder-db-sync-jnkpx" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.253247 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db5d9649c-rxkr8"] Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.295286 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pmmnm"] Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.296634 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303657 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-combined-ca-bundle\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303711 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnfk\" (UniqueName: \"kubernetes.io/projected/6a9bafbf-4733-4178-8012-3e94d02aa9cb-kube-api-access-tlnfk\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303736 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-db-sync-config-data\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303773 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-combined-ca-bundle\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303819 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-config-data\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303843 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78f7c5-5965-4932-845c-8f0be90b421a-logs\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303864 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-scripts\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.303884 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwpmj\" (UniqueName: \"kubernetes.io/projected/4f78f7c5-5965-4932-845c-8f0be90b421a-kube-api-access-wwpmj\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.305378 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2b8cp" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.305741 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78f7c5-5965-4932-845c-8f0be90b421a-logs\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.307979 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.308309 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-combined-ca-bundle\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.308825 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-scripts\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.340692 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwpmj\" (UniqueName: \"kubernetes.io/projected/4f78f7c5-5965-4932-845c-8f0be90b421a-kube-api-access-wwpmj\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:29 crc kubenswrapper[4672]: I1206 09:23:29.377719 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pmmnm"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.410312 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4dcrc"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.412693 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-config-data\") pod \"placement-db-sync-wz7rb\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.419332 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-combined-ca-bundle\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc 
kubenswrapper[4672]: I1206 09:23:29.419421 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnfk\" (UniqueName: \"kubernetes.io/projected/6a9bafbf-4733-4178-8012-3e94d02aa9cb-kube-api-access-tlnfk\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.419462 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-db-sync-config-data\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.421497 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.426538 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.438025 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bdl8h" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.438406 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.452040 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76445f8cf5-d9mzn"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.453640 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.462624 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4dcrc"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.463354 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-combined-ca-bundle\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.468965 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-db-sync-config-data\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.472137 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnfk\" (UniqueName: \"kubernetes.io/projected/6a9bafbf-4733-4178-8012-3e94d02aa9cb-kube-api-access-tlnfk\") pod \"barbican-db-sync-pmmnm\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.485129 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76445f8cf5-d9mzn"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.489139 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jnkpx" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.499646 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.506046 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.513423 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.513806 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.514575 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.525379 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-scripts\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-config\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621857 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpswz\" (UniqueName: \"kubernetes.io/projected/e5e31a55-73d0-43a5-8308-40e18ab22f58-kube-api-access-hpswz\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621877 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-dns-svc\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621892 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-nb\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621927 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-config-data\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621940 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-config\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621961 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8km79\" (UniqueName: \"kubernetes.io/projected/c2437522-da6e-48b0-94b4-30b0968bccde-kube-api-access-8km79\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621974 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-combined-ca-bundle\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.621990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-log-httpd\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.622202 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-run-httpd\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.622258 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-sb\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.622295 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.622347 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28m7\" (UniqueName: \"kubernetes.io/projected/dc4be349-0e33-467d-9214-134e91da88f2-kube-api-access-f28m7\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.622457 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.676989 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729140 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-run-httpd\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729192 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-sb\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729220 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729247 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28m7\" (UniqueName: \"kubernetes.io/projected/dc4be349-0e33-467d-9214-134e91da88f2-kube-api-access-f28m7\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729704 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729738 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-scripts\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729763 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-config\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729798 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpswz\" (UniqueName: \"kubernetes.io/projected/e5e31a55-73d0-43a5-8308-40e18ab22f58-kube-api-access-hpswz\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729814 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-dns-svc\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729829 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-nb\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729861 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-config-data\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729879 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-config\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.729906 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8km79\" (UniqueName: \"kubernetes.io/projected/c2437522-da6e-48b0-94b4-30b0968bccde-kube-api-access-8km79\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.730031 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-combined-ca-bundle\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.730056 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-log-httpd\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.730581 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-log-httpd\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.733920 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-nb\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.734100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-sb\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.734236 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-config\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: 
I1206 09:23:29.734998 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-dns-svc\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.735347 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-run-httpd\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.740338 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-combined-ca-bundle\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.743196 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.746299 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-config\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.746328 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.748557 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-config-data\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.755469 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-scripts\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.758256 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpswz\" (UniqueName: \"kubernetes.io/projected/e5e31a55-73d0-43a5-8308-40e18ab22f58-kube-api-access-hpswz\") pod \"neutron-db-sync-4dcrc\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.760293 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8km79\" (UniqueName: \"kubernetes.io/projected/c2437522-da6e-48b0-94b4-30b0968bccde-kube-api-access-8km79\") pod \"ceilometer-0\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 
09:23:29.761666 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28m7\" (UniqueName: \"kubernetes.io/projected/dc4be349-0e33-467d-9214-134e91da88f2-kube-api-access-f28m7\") pod \"dnsmasq-dns-76445f8cf5-d9mzn\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.773953 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.792107 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:29.864105 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:30.608680 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qcdzb"] Dec 06 09:23:30 crc kubenswrapper[4672]: I1206 09:23:30.987676 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4dcrc"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.009791 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76445f8cf5-d9mzn"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.020585 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jnkpx"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.114896 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db5d9649c-rxkr8"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.164522 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.179399 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wz7rb"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.199524 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:23:31 crc kubenswrapper[4672]: W1206 09:23:31.201823 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296e7802_9ad8_4dc7_8adb_cccf91467e55.slice/crio-e995e93b2ba50bd7e3932b08ab4e3b7d7eb3c03986b8af19a4a43027c7f6cf8a WatchSource:0}: Error finding container e995e93b2ba50bd7e3932b08ab4e3b7d7eb3c03986b8af19a4a43027c7f6cf8a: Status 404 returned error can't find the container with id e995e93b2ba50bd7e3932b08ab4e3b7d7eb3c03986b8af19a4a43027c7f6cf8a Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.236975 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pmmnm"] Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.495306 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnkpx" event={"ID":"34617816-681d-44a7-b88d-73983735dd75","Type":"ContainerStarted","Data":"704766205aca81a7409d17ad0a051289746b3fc747594baab804916133fce7ed"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.496407 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerStarted","Data":"cf1a2d80d0cecb2f41beb1f6efbe7a39416dd9e46c15c8b9183c30eceeb413b5"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.497242 4672 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" event={"ID":"dc4be349-0e33-467d-9214-134e91da88f2","Type":"ContainerStarted","Data":"04dcb28b07415016bda166d74989e7e067bce99dcec8daf5aaa1f49ae4bf9824"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.497888 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4dcrc" event={"ID":"e5e31a55-73d0-43a5-8308-40e18ab22f58","Type":"ContainerStarted","Data":"06cdb953c5972893e8e5dc076407ee5f6822fdd3dcf5c4900fd0be883570b396"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.498776 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qcdzb" event={"ID":"a75d1d69-af3e-47a1-ac03-0d03b40c0999","Type":"ContainerStarted","Data":"08985045aa95156649c283873a6329b5a506bb275fa4b543c4ba84e1df919191"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.498794 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qcdzb" event={"ID":"a75d1d69-af3e-47a1-ac03-0d03b40c0999","Type":"ContainerStarted","Data":"773e8b6c9ee7823d220f6d95f7e4ceba39f73d173fea52c5aedbfa543a60a0a8"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.500568 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmmnm" event={"ID":"6a9bafbf-4733-4178-8012-3e94d02aa9cb","Type":"ContainerStarted","Data":"2d509fb51ee9d2ebacbee4c925cdbdf7800bbafd640f7014321f194c2e722115"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.501152 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8" event={"ID":"296e7802-9ad8-4dc7-8adb-cccf91467e55","Type":"ContainerStarted","Data":"e995e93b2ba50bd7e3932b08ab4e3b7d7eb3c03986b8af19a4a43027c7f6cf8a"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.501781 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz7rb" event={"ID":"4f78f7c5-5965-4932-845c-8f0be90b421a","Type":"ContainerStarted","Data":"d596fefb46f713114548cd4a1d0e410fbd1fc6926a90928ecfa98ab61acd18a7"} Dec 06 09:23:31 crc kubenswrapper[4672]: I1206 09:23:31.535133 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qcdzb" podStartSLOduration=3.535112323 podStartE2EDuration="3.535112323s" podCreationTimestamp="2025-12-06 09:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:31.522061118 +0000 UTC m=+1029.266321405" watchObservedRunningTime="2025-12-06 09:23:31.535112323 +0000 UTC m=+1029.279372610" Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 09:23:32.530581 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc4be349-0e33-467d-9214-134e91da88f2" containerID="c0f656ea5633a2c28d6df83d0b8b981a77b4ce557e0e91c5858fb26740879fb3" exitCode=0 Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 09:23:32.530655 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" event={"ID":"dc4be349-0e33-467d-9214-134e91da88f2","Type":"ContainerDied","Data":"c0f656ea5633a2c28d6df83d0b8b981a77b4ce557e0e91c5858fb26740879fb3"} Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 09:23:32.536887 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4dcrc" event={"ID":"e5e31a55-73d0-43a5-8308-40e18ab22f58","Type":"ContainerStarted","Data":"7f4f5e161f8febd2d5ed444cf217a57031e468ee35323ee9e8ec347625d57236"} Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 
09:23:32.561852 4672 generic.go:334] "Generic (PLEG): container finished" podID="296e7802-9ad8-4dc7-8adb-cccf91467e55" containerID="a0f529b01d2753af627c9a8490392ed703dcad409b3b1a6431d129b8f691a2fa" exitCode=0 Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 09:23:32.580197 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8" event={"ID":"296e7802-9ad8-4dc7-8adb-cccf91467e55","Type":"ContainerDied","Data":"a0f529b01d2753af627c9a8490392ed703dcad409b3b1a6431d129b8f691a2fa"} Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 09:23:32.777576 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4dcrc" podStartSLOduration=3.777559769 podStartE2EDuration="3.777559769s" podCreationTimestamp="2025-12-06 09:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:32.729994497 +0000 UTC m=+1030.474254784" watchObservedRunningTime="2025-12-06 09:23:32.777559769 +0000 UTC m=+1030.521820056" Dec 06 09:23:32 crc kubenswrapper[4672]: I1206 09:23:32.966496 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.143208 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-config\") pod \"296e7802-9ad8-4dc7-8adb-cccf91467e55\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.143322 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-sb\") pod \"296e7802-9ad8-4dc7-8adb-cccf91467e55\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.143369 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-nb\") pod \"296e7802-9ad8-4dc7-8adb-cccf91467e55\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.143453 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh6ln\" (UniqueName: \"kubernetes.io/projected/296e7802-9ad8-4dc7-8adb-cccf91467e55-kube-api-access-sh6ln\") pod \"296e7802-9ad8-4dc7-8adb-cccf91467e55\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.143491 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-dns-svc\") pod \"296e7802-9ad8-4dc7-8adb-cccf91467e55\" (UID: \"296e7802-9ad8-4dc7-8adb-cccf91467e55\") " Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.168887 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296e7802-9ad8-4dc7-8adb-cccf91467e55-kube-api-access-sh6ln" (OuterVolumeSpecName: "kube-api-access-sh6ln") pod "296e7802-9ad8-4dc7-8adb-cccf91467e55" (UID: "296e7802-9ad8-4dc7-8adb-cccf91467e55"). InnerVolumeSpecName "kube-api-access-sh6ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.178402 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "296e7802-9ad8-4dc7-8adb-cccf91467e55" (UID: "296e7802-9ad8-4dc7-8adb-cccf91467e55"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.186219 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-config" (OuterVolumeSpecName: "config") pod "296e7802-9ad8-4dc7-8adb-cccf91467e55" (UID: "296e7802-9ad8-4dc7-8adb-cccf91467e55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.195265 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "296e7802-9ad8-4dc7-8adb-cccf91467e55" (UID: "296e7802-9ad8-4dc7-8adb-cccf91467e55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.199857 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "296e7802-9ad8-4dc7-8adb-cccf91467e55" (UID: "296e7802-9ad8-4dc7-8adb-cccf91467e55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.244867 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh6ln\" (UniqueName: \"kubernetes.io/projected/296e7802-9ad8-4dc7-8adb-cccf91467e55-kube-api-access-sh6ln\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.244893 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.244902 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.244911 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.244919 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/296e7802-9ad8-4dc7-8adb-cccf91467e55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.583902 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.583888 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db5d9649c-rxkr8" event={"ID":"296e7802-9ad8-4dc7-8adb-cccf91467e55","Type":"ContainerDied","Data":"e995e93b2ba50bd7e3932b08ab4e3b7d7eb3c03986b8af19a4a43027c7f6cf8a"} Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.583960 4672 scope.go:117] "RemoveContainer" containerID="a0f529b01d2753af627c9a8490392ed703dcad409b3b1a6431d129b8f691a2fa" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.591565 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" event={"ID":"dc4be349-0e33-467d-9214-134e91da88f2","Type":"ContainerStarted","Data":"7ae30a5f2eb33c96762e4adef64716062087190ba3f4512d00813545c751f0b7"} Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.611740 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" podStartSLOduration=4.611724176 podStartE2EDuration="4.611724176s" podCreationTimestamp="2025-12-06 09:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:33.609333951 +0000 UTC m=+1031.353594258" watchObservedRunningTime="2025-12-06 09:23:33.611724176 +0000 UTC m=+1031.355984463" Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.666507 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db5d9649c-rxkr8"] Dec 06 09:23:33 crc kubenswrapper[4672]: I1206 09:23:33.674712 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6db5d9649c-rxkr8"] Dec 06 09:23:34 crc kubenswrapper[4672]: I1206 09:23:34.568876 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296e7802-9ad8-4dc7-8adb-cccf91467e55" path="/var/lib/kubelet/pods/296e7802-9ad8-4dc7-8adb-cccf91467e55/volumes" Dec 06 09:23:34 crc kubenswrapper[4672]: I1206 09:23:34.611050 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:36 crc kubenswrapper[4672]: I1206 09:23:36.627621 4672 generic.go:334] "Generic (PLEG): container finished" podID="a75d1d69-af3e-47a1-ac03-0d03b40c0999" containerID="08985045aa95156649c283873a6329b5a506bb275fa4b543c4ba84e1df919191" exitCode=0 Dec 06 09:23:36 crc kubenswrapper[4672]: I1206 09:23:36.627708 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qcdzb" event={"ID":"a75d1d69-af3e-47a1-ac03-0d03b40c0999","Type":"ContainerDied","Data":"08985045aa95156649c283873a6329b5a506bb275fa4b543c4ba84e1df919191"} Dec 06 09:23:39 crc kubenswrapper[4672]: I1206 09:23:39.793921 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:23:39 crc kubenswrapper[4672]: I1206 09:23:39.869719 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75cb88fd77-prz47"] Dec 06 09:23:39 crc kubenswrapper[4672]: I1206 09:23:39.870327 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="dnsmasq-dns" containerID="cri-o://aa2e917ca3534c104ac0063fc2fe89051234da2a787ec4135f6f2e51e349eeea" gracePeriod=10 Dec 06 09:23:40 crc kubenswrapper[4672]: I1206 09:23:40.661631 4672 generic.go:334] 
"Generic (PLEG): container finished" podID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerID="aa2e917ca3534c104ac0063fc2fe89051234da2a787ec4135f6f2e51e349eeea" exitCode=0 Dec 06 09:23:40 crc kubenswrapper[4672]: I1206 09:23:40.661679 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" event={"ID":"dd82121b-106e-4488-b0e7-5f1c081077d4","Type":"ContainerDied","Data":"aa2e917ca3534c104ac0063fc2fe89051234da2a787ec4135f6f2e51e349eeea"} Dec 06 09:23:40 crc kubenswrapper[4672]: I1206 09:23:40.760896 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.081223 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qcdzb" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.209627 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-scripts\") pod \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.210025 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-combined-ca-bundle\") pod \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.210054 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b8mk\" (UniqueName: \"kubernetes.io/projected/a75d1d69-af3e-47a1-ac03-0d03b40c0999-kube-api-access-5b8mk\") pod \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.210084 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-credential-keys\") pod \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.210136 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-fernet-keys\") pod \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.210211 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-config-data\") pod \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\" (UID: \"a75d1d69-af3e-47a1-ac03-0d03b40c0999\") " Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.217043 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a75d1d69-af3e-47a1-ac03-0d03b40c0999" (UID: "a75d1d69-af3e-47a1-ac03-0d03b40c0999"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.217065 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a75d1d69-af3e-47a1-ac03-0d03b40c0999" (UID: "a75d1d69-af3e-47a1-ac03-0d03b40c0999"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.217348 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-scripts" (OuterVolumeSpecName: "scripts") pod "a75d1d69-af3e-47a1-ac03-0d03b40c0999" (UID: "a75d1d69-af3e-47a1-ac03-0d03b40c0999"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.231977 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75d1d69-af3e-47a1-ac03-0d03b40c0999-kube-api-access-5b8mk" (OuterVolumeSpecName: "kube-api-access-5b8mk") pod "a75d1d69-af3e-47a1-ac03-0d03b40c0999" (UID: "a75d1d69-af3e-47a1-ac03-0d03b40c0999"). InnerVolumeSpecName "kube-api-access-5b8mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.236533 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-config-data" (OuterVolumeSpecName: "config-data") pod "a75d1d69-af3e-47a1-ac03-0d03b40c0999" (UID: "a75d1d69-af3e-47a1-ac03-0d03b40c0999"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.255913 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a75d1d69-af3e-47a1-ac03-0d03b40c0999" (UID: "a75d1d69-af3e-47a1-ac03-0d03b40c0999"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.311670 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.311700 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b8mk\" (UniqueName: \"kubernetes.io/projected/a75d1d69-af3e-47a1-ac03-0d03b40c0999-kube-api-access-5b8mk\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.311710 4672 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.311718 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.311726 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.311733 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a75d1d69-af3e-47a1-ac03-0d03b40c0999-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.671136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qcdzb" event={"ID":"a75d1d69-af3e-47a1-ac03-0d03b40c0999","Type":"ContainerDied","Data":"773e8b6c9ee7823d220f6d95f7e4ceba39f73d173fea52c5aedbfa543a60a0a8"} Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.671174 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773e8b6c9ee7823d220f6d95f7e4ceba39f73d173fea52c5aedbfa543a60a0a8" Dec 06 09:23:41 crc kubenswrapper[4672]: I1206 09:23:41.671206 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qcdzb" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.170180 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qcdzb"] Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.206681 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qcdzb"] Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.272028 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hrzmd"] Dec 06 09:23:42 crc kubenswrapper[4672]: E1206 09:23:42.272380 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75d1d69-af3e-47a1-ac03-0d03b40c0999" containerName="keystone-bootstrap" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.272399 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75d1d69-af3e-47a1-ac03-0d03b40c0999" containerName="keystone-bootstrap" Dec 06 09:23:42 crc kubenswrapper[4672]: E1206 09:23:42.272414 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296e7802-9ad8-4dc7-8adb-cccf91467e55" containerName="init" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.272420 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="296e7802-9ad8-4dc7-8adb-cccf91467e55" containerName="init" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.272583 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="296e7802-9ad8-4dc7-8adb-cccf91467e55" containerName="init" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.272618 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75d1d69-af3e-47a1-ac03-0d03b40c0999" containerName="keystone-bootstrap" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.273124 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.279476 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.279696 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vq6n8" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.279902 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.280033 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.280173 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.281894 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hrzmd"] Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.434686 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-fernet-keys\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.434982 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-credential-keys\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.435104 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-config-data\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.435134 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2k6\" (UniqueName: \"kubernetes.io/projected/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-kube-api-access-nv2k6\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.435204 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-scripts\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.435372 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-combined-ca-bundle\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.537687 4672 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-credential-keys\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.537775 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-config-data\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.537800 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2k6\" (UniqueName: \"kubernetes.io/projected/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-kube-api-access-nv2k6\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.537867 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-scripts\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.537917 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-combined-ca-bundle\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.537956 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-fernet-keys\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.543276 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-config-data\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.543736 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-scripts\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.544946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-combined-ca-bundle\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.546217 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-fernet-keys\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") 
" pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.554235 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-credential-keys\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.560128 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2k6\" (UniqueName: \"kubernetes.io/projected/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-kube-api-access-nv2k6\") pod \"keystone-bootstrap-hrzmd\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.568251 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75d1d69-af3e-47a1-ac03-0d03b40c0999" path="/var/lib/kubelet/pods/a75d1d69-af3e-47a1-ac03-0d03b40c0999/volumes" Dec 06 09:23:42 crc kubenswrapper[4672]: I1206 09:23:42.593610 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.498206 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.664792 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-dns-svc\") pod \"dd82121b-106e-4488-b0e7-5f1c081077d4\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.664874 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-config\") pod \"dd82121b-106e-4488-b0e7-5f1c081077d4\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.664916 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-nb\") pod \"dd82121b-106e-4488-b0e7-5f1c081077d4\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.664935 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-sb\") pod \"dd82121b-106e-4488-b0e7-5f1c081077d4\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.664970 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pskrq\" (UniqueName: \"kubernetes.io/projected/dd82121b-106e-4488-b0e7-5f1c081077d4-kube-api-access-pskrq\") pod \"dd82121b-106e-4488-b0e7-5f1c081077d4\" (UID: \"dd82121b-106e-4488-b0e7-5f1c081077d4\") " Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.671828 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd82121b-106e-4488-b0e7-5f1c081077d4-kube-api-access-pskrq" (OuterVolumeSpecName: "kube-api-access-pskrq") pod "dd82121b-106e-4488-b0e7-5f1c081077d4" (UID: "dd82121b-106e-4488-b0e7-5f1c081077d4"). 
InnerVolumeSpecName "kube-api-access-pskrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.717143 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd82121b-106e-4488-b0e7-5f1c081077d4" (UID: "dd82121b-106e-4488-b0e7-5f1c081077d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.725092 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd82121b-106e-4488-b0e7-5f1c081077d4" (UID: "dd82121b-106e-4488-b0e7-5f1c081077d4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.727250 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd82121b-106e-4488-b0e7-5f1c081077d4" (UID: "dd82121b-106e-4488-b0e7-5f1c081077d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.730903 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-config" (OuterVolumeSpecName: "config") pod "dd82121b-106e-4488-b0e7-5f1c081077d4" (UID: "dd82121b-106e-4488-b0e7-5f1c081077d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.748232 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" event={"ID":"dd82121b-106e-4488-b0e7-5f1c081077d4","Type":"ContainerDied","Data":"c589be46e5af1650a4ff32af46254dc38666a3b1fdb8fd3f3536b0157a579ba0"} Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.748284 4672 scope.go:117] "RemoveContainer" containerID="aa2e917ca3534c104ac0063fc2fe89051234da2a787ec4135f6f2e51e349eeea" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.748428 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.769360 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.769396 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.769416 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.769430 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pskrq\" (UniqueName: \"kubernetes.io/projected/dd82121b-106e-4488-b0e7-5f1c081077d4-kube-api-access-pskrq\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.769442 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd82121b-106e-4488-b0e7-5f1c081077d4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.807196 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75cb88fd77-prz47"] Dec 06 09:23:49 crc kubenswrapper[4672]: I1206 09:23:49.813454 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75cb88fd77-prz47"] Dec 06 09:23:50 crc kubenswrapper[4672]: E1206 09:23:50.549320 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 06 09:23:50 crc kubenswrapper[4672]: E1206 09:23:50.549653 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9kdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jnkpx_openstack(34617816-681d-44a7-b88d-73983735dd75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 09:23:50 crc kubenswrapper[4672]: E1206 09:23:50.551660 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jnkpx" podUID="34617816-681d-44a7-b88d-73983735dd75" Dec 06 09:23:50 crc kubenswrapper[4672]: I1206 09:23:50.615230 4672 scope.go:117] "RemoveContainer" containerID="97e9da8f79d9127fbd4be22a3490dde3eb27d9700610f1d31dfc4437b7ac187a" Dec 06 09:23:50 crc kubenswrapper[4672]: I1206 09:23:50.626001 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" path="/var/lib/kubelet/pods/dd82121b-106e-4488-b0e7-5f1c081077d4/volumes" Dec 06 09:23:50 crc kubenswrapper[4672]: I1206 09:23:50.757470 4672 generic.go:334] "Generic (PLEG): container finished" podID="e5e31a55-73d0-43a5-8308-40e18ab22f58" containerID="7f4f5e161f8febd2d5ed444cf217a57031e468ee35323ee9e8ec347625d57236" exitCode=0 Dec 06 09:23:50 crc kubenswrapper[4672]: I1206 09:23:50.757569 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-4dcrc" event={"ID":"e5e31a55-73d0-43a5-8308-40e18ab22f58","Type":"ContainerDied","Data":"7f4f5e161f8febd2d5ed444cf217a57031e468ee35323ee9e8ec347625d57236"} Dec 06 09:23:50 crc kubenswrapper[4672]: I1206 09:23:50.760224 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75cb88fd77-prz47" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Dec 06 09:23:50 crc kubenswrapper[4672]: E1206 09:23:50.767944 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-jnkpx" podUID="34617816-681d-44a7-b88d-73983735dd75" Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.020166 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hrzmd"] Dec 06 09:23:51 crc kubenswrapper[4672]: W1206 09:23:51.025060 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e11e53b_de6c_4e98_8a9c_7fbdac1a1401.slice/crio-a70998765929e04f488b26645c5114600898cb79c1fdaa8910186cb37c9c82e4 WatchSource:0}: Error finding container a70998765929e04f488b26645c5114600898cb79c1fdaa8910186cb37c9c82e4: Status 404 returned error can't find the container with id a70998765929e04f488b26645c5114600898cb79c1fdaa8910186cb37c9c82e4 Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.781187 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmmnm" event={"ID":"6a9bafbf-4733-4178-8012-3e94d02aa9cb","Type":"ContainerStarted","Data":"3ce98b553cc83461588d7a633c3558405e42b7ae843dd18b41c50eb9a8cce884"} Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.788531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz7rb" event={"ID":"4f78f7c5-5965-4932-845c-8f0be90b421a","Type":"ContainerStarted","Data":"7ea2d51b6c8a3a3d7dd06aa4aa9f77784520acda5263e9107f77febfec11562e"} Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.790479 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerStarted","Data":"f5f794408b6d4724c62aae1e5ee1e4ac7383ab3451759f6b8e6aa0e0ab98ad7c"} Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.791884 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrzmd" event={"ID":"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401","Type":"ContainerStarted","Data":"a70998765929e04f488b26645c5114600898cb79c1fdaa8910186cb37c9c82e4"} Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.841623 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pmmnm" podStartSLOduration=3.463409881 podStartE2EDuration="22.841588744s" podCreationTimestamp="2025-12-06 09:23:29 +0000 UTC" firstStartedPulling="2025-12-06 09:23:31.23978205 +0000 UTC m=+1028.984042337" lastFinishedPulling="2025-12-06 09:23:50.617960913 +0000 UTC m=+1048.362221200" observedRunningTime="2025-12-06 09:23:51.822380594 +0000 UTC m=+1049.566640881" watchObservedRunningTime="2025-12-06 09:23:51.841588744 +0000 UTC m=+1049.585849031" Dec 06 09:23:51 crc kubenswrapper[4672]: I1206 09:23:51.842113 
4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wz7rb" podStartSLOduration=3.4453191690000002 podStartE2EDuration="22.842108577s" podCreationTimestamp="2025-12-06 09:23:29 +0000 UTC" firstStartedPulling="2025-12-06 09:23:31.21877669 +0000 UTC m=+1028.963036977" lastFinishedPulling="2025-12-06 09:23:50.615566098 +0000 UTC m=+1048.359826385" observedRunningTime="2025-12-06 09:23:51.840525134 +0000 UTC m=+1049.584785431" watchObservedRunningTime="2025-12-06 09:23:51.842108577 +0000 UTC m=+1049.586368864" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.283568 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.438200 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-combined-ca-bundle\") pod \"e5e31a55-73d0-43a5-8308-40e18ab22f58\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.438334 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpswz\" (UniqueName: \"kubernetes.io/projected/e5e31a55-73d0-43a5-8308-40e18ab22f58-kube-api-access-hpswz\") pod \"e5e31a55-73d0-43a5-8308-40e18ab22f58\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.438457 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-config\") pod \"e5e31a55-73d0-43a5-8308-40e18ab22f58\" (UID: \"e5e31a55-73d0-43a5-8308-40e18ab22f58\") " Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.443370 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e31a55-73d0-43a5-8308-40e18ab22f58-kube-api-access-hpswz" (OuterVolumeSpecName: "kube-api-access-hpswz") pod "e5e31a55-73d0-43a5-8308-40e18ab22f58" (UID: "e5e31a55-73d0-43a5-8308-40e18ab22f58"). InnerVolumeSpecName "kube-api-access-hpswz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.460255 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5e31a55-73d0-43a5-8308-40e18ab22f58" (UID: "e5e31a55-73d0-43a5-8308-40e18ab22f58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.470009 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-config" (OuterVolumeSpecName: "config") pod "e5e31a55-73d0-43a5-8308-40e18ab22f58" (UID: "e5e31a55-73d0-43a5-8308-40e18ab22f58"). InnerVolumeSpecName "config". 
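The two "Observed pod startup duration" records above encode a precise breakdown. For barbican-db-sync-pmmnm, podStartE2EDuration is 22.841588744 s; the image-pull window, taken from the monotonic m=+ offsets, is 1048.362221200 - 1028.984042337 = 19.378178863 s; and 22.841588744 - 19.378178863 = 3.463409881 s, exactly the reported podStartSLOduration. In other words, the SLO figure excludes image-pull time. A minimal sketch reproducing the breakdown; "kubelet.log" is a placeholder path, and pods that pulled nothing carry the zero-value timestamp "0001-01-01 …" with no m=+ offset, which the sketch treats as a zero pull window:

    import re

    LAT_RE = re.compile(
        r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" '
        r'podStartSLOduration=(?P<slo>[\d.]+) '
        r'podStartE2EDuration="(?P<e2e>[\d.]+)s".*?'
        r'firstStartedPulling="(?P<first>[^"]+)" lastFinishedPulling="(?P<last>[^"]+)"'
    )
    MONO = re.compile(r'm=\+([\d.]+)')

    def pull_seconds(first, last):
        a, b = MONO.search(first), MONO.search(last)
        return float(b.group(1)) - float(a.group(1)) if a and b else 0.0

    def startup_breakdown(lines):
        for line in lines:
            for m in LAT_RE.finditer(line):
                pull = pull_seconds(m.group("first"), m.group("last"))
                yield m.group("pod"), float(m.group("slo")), float(m.group("e2e")), pull

    for pod, slo, e2e, pull in startup_breakdown(open("kubelet.log", encoding="utf-8")):  # placeholder path
        # e2e - pull should match slo whenever an image was actually pulled
        print(f"{pod:<45} slo={slo:8.3f}s e2e={e2e:8.3f}s pull={pull:8.3f}s")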
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.540684 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.540747 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpswz\" (UniqueName: \"kubernetes.io/projected/e5e31a55-73d0-43a5-8308-40e18ab22f58-kube-api-access-hpswz\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.540766 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e31a55-73d0-43a5-8308-40e18ab22f58-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.817646 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4dcrc" event={"ID":"e5e31a55-73d0-43a5-8308-40e18ab22f58","Type":"ContainerDied","Data":"06cdb953c5972893e8e5dc076407ee5f6822fdd3dcf5c4900fd0be883570b396"} Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.818231 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06cdb953c5972893e8e5dc076407ee5f6822fdd3dcf5c4900fd0be883570b396" Dec 06 09:23:52 crc kubenswrapper[4672]: I1206 09:23:52.817774 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4dcrc" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.121963 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d47f95b99-h4jdd"] Dec 06 09:23:53 crc kubenswrapper[4672]: E1206 09:23:53.122436 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="init" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.122456 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="init" Dec 06 09:23:53 crc kubenswrapper[4672]: E1206 09:23:53.122469 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e31a55-73d0-43a5-8308-40e18ab22f58" containerName="neutron-db-sync" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.122475 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e31a55-73d0-43a5-8308-40e18ab22f58" containerName="neutron-db-sync" Dec 06 09:23:53 crc kubenswrapper[4672]: E1206 09:23:53.122495 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="dnsmasq-dns" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.122521 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="dnsmasq-dns" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.122775 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e31a55-73d0-43a5-8308-40e18ab22f58" containerName="neutron-db-sync" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.122793 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd82121b-106e-4488-b0e7-5f1c081077d4" containerName="dnsmasq-dns" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.127437 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.196009 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d47f95b99-h4jdd"] Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.265736 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhddw\" (UniqueName: \"kubernetes.io/projected/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-kube-api-access-lhddw\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.266003 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-config\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.266106 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.266171 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-dns-svc\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.266274 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-sb\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.310845 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85f8bd445b-dxm5t"] Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.312041 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.314509 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.314733 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bdl8h" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.314861 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.316996 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.337927 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85f8bd445b-dxm5t"] Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.371837 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-config\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.371958 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.371988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-dns-svc\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.372065 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-sb\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.372394 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhddw\" (UniqueName: \"kubernetes.io/projected/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-kube-api-access-lhddw\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.373549 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-config\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.374676 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " 
pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.375530 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-dns-svc\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.376191 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-sb\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.408389 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhddw\" (UniqueName: \"kubernetes.io/projected/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-kube-api-access-lhddw\") pod \"dnsmasq-dns-5d47f95b99-h4jdd\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.445310 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.473877 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-httpd-config\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.474163 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-ovndb-tls-certs\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.474193 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpsdf\" (UniqueName: \"kubernetes.io/projected/b2eaecb7-d959-4452-ba91-029191056f70-kube-api-access-rpsdf\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.474211 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-config\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.474399 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-combined-ca-bundle\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.577426 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-ovndb-tls-certs\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.577509 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpsdf\" (UniqueName: \"kubernetes.io/projected/b2eaecb7-d959-4452-ba91-029191056f70-kube-api-access-rpsdf\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.577536 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-config\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.577588 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-combined-ca-bundle\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.577708 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-httpd-config\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.587527 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-httpd-config\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.589849 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-combined-ca-bundle\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.590589 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-config\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.607216 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-ovndb-tls-certs\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.612731 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpsdf\" (UniqueName: \"kubernetes.io/projected/b2eaecb7-d959-4452-ba91-029191056f70-kube-api-access-rpsdf\") pod \"neutron-85f8bd445b-dxm5t\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " 
pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.631564 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.846785 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrzmd" event={"ID":"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401","Type":"ContainerStarted","Data":"e194302916df6a7135e83785075becbc55e764210162c9c32aaa1fa745ca1f2e"} Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.916442 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hrzmd" podStartSLOduration=11.916417766 podStartE2EDuration="11.916417766s" podCreationTimestamp="2025-12-06 09:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:53.889075957 +0000 UTC m=+1051.633336244" watchObservedRunningTime="2025-12-06 09:23:53.916417766 +0000 UTC m=+1051.660678053" Dec 06 09:23:53 crc kubenswrapper[4672]: I1206 09:23:53.991614 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d47f95b99-h4jdd"] Dec 06 09:23:54 crc kubenswrapper[4672]: W1206 09:23:54.295615 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2eaecb7_d959_4452_ba91_029191056f70.slice/crio-badedc59d33a3c4f8cdc316723811c668feba004a801b5dda75bdf97e1e8070d WatchSource:0}: Error finding container badedc59d33a3c4f8cdc316723811c668feba004a801b5dda75bdf97e1e8070d: Status 404 returned error can't find the container with id badedc59d33a3c4f8cdc316723811c668feba004a801b5dda75bdf97e1e8070d Dec 06 09:23:54 crc kubenswrapper[4672]: I1206 09:23:54.295720 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85f8bd445b-dxm5t"] Dec 06 09:23:54 crc kubenswrapper[4672]: I1206 09:23:54.854964 4672 generic.go:334] "Generic (PLEG): container finished" podID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerID="a79edf8501a816519884d08ea9dd884df0e124b16c391d48966ef8555ee58b96" exitCode=0 Dec 06 09:23:54 crc kubenswrapper[4672]: I1206 09:23:54.856017 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" event={"ID":"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d","Type":"ContainerDied","Data":"a79edf8501a816519884d08ea9dd884df0e124b16c391d48966ef8555ee58b96"} Dec 06 09:23:54 crc kubenswrapper[4672]: I1206 09:23:54.856147 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" event={"ID":"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d","Type":"ContainerStarted","Data":"8dcea7e50638456f11eb8e42ff0c8ac88f9fcfad9737ae30426b4059bd8dc5be"} Dec 06 09:23:54 crc kubenswrapper[4672]: I1206 09:23:54.860243 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f8bd445b-dxm5t" event={"ID":"b2eaecb7-d959-4452-ba91-029191056f70","Type":"ContainerStarted","Data":"fcfbb3a8fa34c8ded377c8feb108edb04a6c456309a3c94277e9b7012da91f57"} Dec 06 09:23:54 crc kubenswrapper[4672]: I1206 09:23:54.860289 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f8bd445b-dxm5t" event={"ID":"b2eaecb7-d959-4452-ba91-029191056f70","Type":"ContainerStarted","Data":"badedc59d33a3c4f8cdc316723811c668feba004a801b5dda75bdf97e1e8070d"} Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.842717 4672 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/neutron-d999c477-wf9vn"] Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.844722 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.850067 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.850254 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.851407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d999c477-wf9vn"] Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.868797 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-httpd-config\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.868872 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-internal-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.868963 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfgv\" (UniqueName: \"kubernetes.io/projected/bc4c1773-bc77-4592-aff9-04323f477805-kube-api-access-bpfgv\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.869053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-config\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.869194 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-ovndb-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.869261 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-combined-ca-bundle\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.869282 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-public-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: 
I1206 09:23:55.869522 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" event={"ID":"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d","Type":"ContainerStarted","Data":"cc5c57e0444f2c882380f8fb6782250e02ed4a3736c7b388cf6625adce224300"} Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.870219 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.873159 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f8bd445b-dxm5t" event={"ID":"b2eaecb7-d959-4452-ba91-029191056f70","Type":"ContainerStarted","Data":"2f424b0498a0ac11b6c44b1a0c740dfcb7daa1e794b1450978a3bb7757bf7b89"} Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.873709 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.907954 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" podStartSLOduration=2.907937557 podStartE2EDuration="2.907937557s" podCreationTimestamp="2025-12-06 09:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:55.905755368 +0000 UTC m=+1053.650015655" watchObservedRunningTime="2025-12-06 09:23:55.907937557 +0000 UTC m=+1053.652197844" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.927274 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85f8bd445b-dxm5t" podStartSLOduration=2.927252809 podStartE2EDuration="2.927252809s" podCreationTimestamp="2025-12-06 09:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:55.92134709 +0000 UTC m=+1053.665607377" watchObservedRunningTime="2025-12-06 09:23:55.927252809 +0000 UTC m=+1053.671513096" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970315 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-ovndb-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970431 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-combined-ca-bundle\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970460 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-public-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-httpd-config\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " 
pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970570 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-internal-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970644 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfgv\" (UniqueName: \"kubernetes.io/projected/bc4c1773-bc77-4592-aff9-04323f477805-kube-api-access-bpfgv\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.970705 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-config\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.978803 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-internal-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.983360 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-ovndb-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.987089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-httpd-config\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.992088 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-config\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:55 crc kubenswrapper[4672]: I1206 09:23:55.995392 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-public-tls-certs\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:56 crc kubenswrapper[4672]: I1206 09:23:55.997324 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c1773-bc77-4592-aff9-04323f477805-combined-ca-bundle\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:56 crc kubenswrapper[4672]: I1206 09:23:56.005275 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfgv\" (UniqueName: 
\"kubernetes.io/projected/bc4c1773-bc77-4592-aff9-04323f477805-kube-api-access-bpfgv\") pod \"neutron-d999c477-wf9vn\" (UID: \"bc4c1773-bc77-4592-aff9-04323f477805\") " pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:56 crc kubenswrapper[4672]: I1206 09:23:56.169010 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:56 crc kubenswrapper[4672]: I1206 09:23:56.882834 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerStarted","Data":"30c8db56ebd0459d81667b1dc66d3ee86fe5dbb7addfd4bca363f1c34ab4de5a"} Dec 06 09:23:56 crc kubenswrapper[4672]: I1206 09:23:56.942453 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d999c477-wf9vn"] Dec 06 09:23:56 crc kubenswrapper[4672]: W1206 09:23:56.947779 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc4c1773_bc77_4592_aff9_04323f477805.slice/crio-ea59da6e7fc4489e480fa2d6e01d64d7ea5de8549feb281fa2b61f41cee2df15 WatchSource:0}: Error finding container ea59da6e7fc4489e480fa2d6e01d64d7ea5de8549feb281fa2b61f41cee2df15: Status 404 returned error can't find the container with id ea59da6e7fc4489e480fa2d6e01d64d7ea5de8549feb281fa2b61f41cee2df15 Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.894477 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f78f7c5-5965-4932-845c-8f0be90b421a" containerID="7ea2d51b6c8a3a3d7dd06aa4aa9f77784520acda5263e9107f77febfec11562e" exitCode=0 Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.894540 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz7rb" event={"ID":"4f78f7c5-5965-4932-845c-8f0be90b421a","Type":"ContainerDied","Data":"7ea2d51b6c8a3a3d7dd06aa4aa9f77784520acda5263e9107f77febfec11562e"} Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.900592 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d999c477-wf9vn" event={"ID":"bc4c1773-bc77-4592-aff9-04323f477805","Type":"ContainerStarted","Data":"8925c7a6678ec64c5b4bad928177833167e1b1c22e04f5620cfaec87e9bb145f"} Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.900644 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.900654 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d999c477-wf9vn" event={"ID":"bc4c1773-bc77-4592-aff9-04323f477805","Type":"ContainerStarted","Data":"d61b6edf3210e43a93df73e5f240d7efc0f8db092ffa99d11c1e4fedb39ef77f"} Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.900663 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d999c477-wf9vn" event={"ID":"bc4c1773-bc77-4592-aff9-04323f477805","Type":"ContainerStarted","Data":"ea59da6e7fc4489e480fa2d6e01d64d7ea5de8549feb281fa2b61f41cee2df15"} Dec 06 09:23:57 crc kubenswrapper[4672]: I1206 09:23:57.957245 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d999c477-wf9vn" podStartSLOduration=2.957224179 podStartE2EDuration="2.957224179s" podCreationTimestamp="2025-12-06 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:23:57.938646136 +0000 UTC m=+1055.682906423" 
watchObservedRunningTime="2025-12-06 09:23:57.957224179 +0000 UTC m=+1055.701484466" Dec 06 09:23:58 crc kubenswrapper[4672]: I1206 09:23:58.917206 4672 generic.go:334] "Generic (PLEG): container finished" podID="6a9bafbf-4733-4178-8012-3e94d02aa9cb" containerID="3ce98b553cc83461588d7a633c3558405e42b7ae843dd18b41c50eb9a8cce884" exitCode=0 Dec 06 09:23:58 crc kubenswrapper[4672]: I1206 09:23:58.917269 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmmnm" event={"ID":"6a9bafbf-4733-4178-8012-3e94d02aa9cb","Type":"ContainerDied","Data":"3ce98b553cc83461588d7a633c3558405e42b7ae843dd18b41c50eb9a8cce884"} Dec 06 09:23:59 crc kubenswrapper[4672]: E1206 09:23:59.367393 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e11e53b_de6c_4e98_8a9c_7fbdac1a1401.slice/crio-conmon-e194302916df6a7135e83785075becbc55e764210162c9c32aaa1fa745ca1f2e.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.492229 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.580486 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78f7c5-5965-4932-845c-8f0be90b421a-logs\") pod \"4f78f7c5-5965-4932-845c-8f0be90b421a\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.580571 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-config-data\") pod \"4f78f7c5-5965-4932-845c-8f0be90b421a\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.580802 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-combined-ca-bundle\") pod \"4f78f7c5-5965-4932-845c-8f0be90b421a\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.581373 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwpmj\" (UniqueName: \"kubernetes.io/projected/4f78f7c5-5965-4932-845c-8f0be90b421a-kube-api-access-wwpmj\") pod \"4f78f7c5-5965-4932-845c-8f0be90b421a\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.581470 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-scripts\") pod \"4f78f7c5-5965-4932-845c-8f0be90b421a\" (UID: \"4f78f7c5-5965-4932-845c-8f0be90b421a\") " Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.581551 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f78f7c5-5965-4932-845c-8f0be90b421a-logs" (OuterVolumeSpecName: "logs") pod "4f78f7c5-5965-4932-845c-8f0be90b421a" (UID: "4f78f7c5-5965-4932-845c-8f0be90b421a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.587747 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-scripts" (OuterVolumeSpecName: "scripts") pod "4f78f7c5-5965-4932-845c-8f0be90b421a" (UID: "4f78f7c5-5965-4932-845c-8f0be90b421a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.611256 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f78f7c5-5965-4932-845c-8f0be90b421a-kube-api-access-wwpmj" (OuterVolumeSpecName: "kube-api-access-wwpmj") pod "4f78f7c5-5965-4932-845c-8f0be90b421a" (UID: "4f78f7c5-5965-4932-845c-8f0be90b421a"). InnerVolumeSpecName "kube-api-access-wwpmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.614986 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-config-data" (OuterVolumeSpecName: "config-data") pod "4f78f7c5-5965-4932-845c-8f0be90b421a" (UID: "4f78f7c5-5965-4932-845c-8f0be90b421a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.614999 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f78f7c5-5965-4932-845c-8f0be90b421a" (UID: "4f78f7c5-5965-4932-845c-8f0be90b421a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.683167 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.683201 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwpmj\" (UniqueName: \"kubernetes.io/projected/4f78f7c5-5965-4932-845c-8f0be90b421a-kube-api-access-wwpmj\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.683212 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.683221 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78f7c5-5965-4932-845c-8f0be90b421a-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.683229 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78f7c5-5965-4932-845c-8f0be90b421a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.931970 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wz7rb" event={"ID":"4f78f7c5-5965-4932-845c-8f0be90b421a","Type":"ContainerDied","Data":"d596fefb46f713114548cd4a1d0e410fbd1fc6926a90928ecfa98ab61acd18a7"} Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.932011 4672 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="d596fefb46f713114548cd4a1d0e410fbd1fc6926a90928ecfa98ab61acd18a7" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.932074 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wz7rb" Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.951243 4672 generic.go:334] "Generic (PLEG): container finished" podID="6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" containerID="e194302916df6a7135e83785075becbc55e764210162c9c32aaa1fa745ca1f2e" exitCode=0 Dec 06 09:23:59 crc kubenswrapper[4672]: I1206 09:23:59.951674 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrzmd" event={"ID":"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401","Type":"ContainerDied","Data":"e194302916df6a7135e83785075becbc55e764210162c9c32aaa1fa745ca1f2e"} Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.026110 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-777f8d8c58-75kwt"] Dec 06 09:24:00 crc kubenswrapper[4672]: E1206 09:24:00.026480 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f78f7c5-5965-4932-845c-8f0be90b421a" containerName="placement-db-sync" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.026496 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f78f7c5-5965-4932-845c-8f0be90b421a" containerName="placement-db-sync" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.026760 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f78f7c5-5965-4932-845c-8f0be90b421a" containerName="placement-db-sync" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.027697 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.034035 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kbwnm" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.034238 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.036713 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.036808 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.036972 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.054926 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-777f8d8c58-75kwt"] Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089079 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-combined-ca-bundle\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089142 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nrmg\" (UniqueName: \"kubernetes.io/projected/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-kube-api-access-7nrmg\") pod \"placement-777f8d8c58-75kwt\" (UID: 
\"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089168 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-internal-tls-certs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089203 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-config-data\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089239 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-logs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089258 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-public-tls-certs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.089274 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-scripts\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190184 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-config-data\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190568 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-logs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190590 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-public-tls-certs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190645 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-scripts\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " 
pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190725 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-combined-ca-bundle\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190762 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nrmg\" (UniqueName: \"kubernetes.io/projected/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-kube-api-access-7nrmg\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.190785 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-internal-tls-certs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.191952 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-logs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.195609 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-config-data\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.200178 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-combined-ca-bundle\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.201195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-public-tls-certs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.214073 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-internal-tls-certs\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.214356 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-scripts\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.222084 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7nrmg\" (UniqueName: \"kubernetes.io/projected/46caf0fe-e392-43fb-8893-2a7bd67bd1a7-kube-api-access-7nrmg\") pod \"placement-777f8d8c58-75kwt\" (UID: \"46caf0fe-e392-43fb-8893-2a7bd67bd1a7\") " pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.351776 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.476254 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.604583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-db-sync-config-data\") pod \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.605043 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlnfk\" (UniqueName: \"kubernetes.io/projected/6a9bafbf-4733-4178-8012-3e94d02aa9cb-kube-api-access-tlnfk\") pod \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.605071 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-combined-ca-bundle\") pod \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\" (UID: \"6a9bafbf-4733-4178-8012-3e94d02aa9cb\") " Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.627680 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9bafbf-4733-4178-8012-3e94d02aa9cb-kube-api-access-tlnfk" (OuterVolumeSpecName: "kube-api-access-tlnfk") pod "6a9bafbf-4733-4178-8012-3e94d02aa9cb" (UID: "6a9bafbf-4733-4178-8012-3e94d02aa9cb"). InnerVolumeSpecName "kube-api-access-tlnfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.627787 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6a9bafbf-4733-4178-8012-3e94d02aa9cb" (UID: "6a9bafbf-4733-4178-8012-3e94d02aa9cb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.646149 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a9bafbf-4733-4178-8012-3e94d02aa9cb" (UID: "6a9bafbf-4733-4178-8012-3e94d02aa9cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.709722 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.709762 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlnfk\" (UniqueName: \"kubernetes.io/projected/6a9bafbf-4733-4178-8012-3e94d02aa9cb-kube-api-access-tlnfk\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.709774 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9bafbf-4733-4178-8012-3e94d02aa9cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.966091 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pmmnm" Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.967811 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmmnm" event={"ID":"6a9bafbf-4733-4178-8012-3e94d02aa9cb","Type":"ContainerDied","Data":"2d509fb51ee9d2ebacbee4c925cdbdf7800bbafd640f7014321f194c2e722115"} Dec 06 09:24:00 crc kubenswrapper[4672]: I1206 09:24:00.967857 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d509fb51ee9d2ebacbee4c925cdbdf7800bbafd640f7014321f194c2e722115" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.165944 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d54549b45-whq64"] Dec 06 09:24:01 crc kubenswrapper[4672]: E1206 09:24:01.166278 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9bafbf-4733-4178-8012-3e94d02aa9cb" containerName="barbican-db-sync" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.166296 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9bafbf-4733-4178-8012-3e94d02aa9cb" containerName="barbican-db-sync" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.166471 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9bafbf-4733-4178-8012-3e94d02aa9cb" containerName="barbican-db-sync" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.167340 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.171003 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.171028 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.171375 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2b8cp" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.201581 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d54549b45-whq64"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.217507 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/963924f1-d56b-4422-af4a-cc5c3a17944f-logs\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.217610 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-config-data-custom\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.217651 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqd55\" (UniqueName: \"kubernetes.io/projected/963924f1-d56b-4422-af4a-cc5c3a17944f-kube-api-access-nqd55\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.217750 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-config-data\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.217885 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-combined-ca-bundle\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.308273 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-868fbbdb46-nq8wk"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.309925 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.318026 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.319420 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-combined-ca-bundle\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.319461 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/963924f1-d56b-4422-af4a-cc5c3a17944f-logs\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.319491 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-config-data-custom\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.319522 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqd55\" (UniqueName: \"kubernetes.io/projected/963924f1-d56b-4422-af4a-cc5c3a17944f-kube-api-access-nqd55\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.319550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-config-data\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.322350 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-777f8d8c58-75kwt"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.329235 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-868fbbdb46-nq8wk"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.330099 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-config-data\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.336315 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-combined-ca-bundle\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.340219 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/963924f1-d56b-4422-af4a-cc5c3a17944f-config-data-custom\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.346921 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/963924f1-d56b-4422-af4a-cc5c3a17944f-logs\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.383995 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqd55\" (UniqueName: \"kubernetes.io/projected/963924f1-d56b-4422-af4a-cc5c3a17944f-kube-api-access-nqd55\") pod \"barbican-worker-6d54549b45-whq64\" (UID: \"963924f1-d56b-4422-af4a-cc5c3a17944f\") " pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.421418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-config-data-custom\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.421660 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-combined-ca-bundle\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.421776 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79992d1b-dc0e-43ad-b6cd-942fadb148e6-logs\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.421846 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-config-data\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.421912 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9l9g\" (UniqueName: \"kubernetes.io/projected/79992d1b-dc0e-43ad-b6cd-942fadb148e6-kube-api-access-n9l9g\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.488006 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d47f95b99-h4jdd"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.488412 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" 
podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="dnsmasq-dns" containerID="cri-o://cc5c57e0444f2c882380f8fb6782250e02ed4a3736c7b388cf6625adce224300" gracePeriod=10 Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.494386 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.494554 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d54549b45-whq64" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.529786 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-config-data-custom\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.529846 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-combined-ca-bundle\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.529879 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79992d1b-dc0e-43ad-b6cd-942fadb148e6-logs\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.529901 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-config-data\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.529919 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9l9g\" (UniqueName: \"kubernetes.io/projected/79992d1b-dc0e-43ad-b6cd-942fadb148e6-kube-api-access-n9l9g\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.534111 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79992d1b-dc0e-43ad-b6cd-942fadb148e6-logs\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.541215 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-config-data\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.554765 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-api-6556b45576-ndnkn"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.556457 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.557634 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-config-data-custom\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.558775 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.568294 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79992d1b-dc0e-43ad-b6cd-942fadb148e6-combined-ca-bundle\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.570202 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9l9g\" (UniqueName: \"kubernetes.io/projected/79992d1b-dc0e-43ad-b6cd-942fadb148e6-kube-api-access-n9l9g\") pod \"barbican-keystone-listener-868fbbdb46-nq8wk\" (UID: \"79992d1b-dc0e-43ad-b6cd-942fadb148e6\") " pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.631826 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6556b45576-ndnkn"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.642548 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.723329 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cc79f6d89-lj8pk"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.726170 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.741469 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-combined-ca-bundle\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.741524 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9fc\" (UniqueName: \"kubernetes.io/projected/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-kube-api-access-4p9fc\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.741559 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-logs\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.741593 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.741626 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data-custom\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.787535 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc79f6d89-lj8pk"] Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.810720 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.842815 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-combined-ca-bundle\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.842866 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9fc\" (UniqueName: \"kubernetes.io/projected/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-kube-api-access-4p9fc\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.842896 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-logs\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.842928 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-dns-svc\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.842949 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.842968 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data-custom\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.843053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfh2\" (UniqueName: \"kubernetes.io/projected/6c4cde3b-2757-4097-a95e-e9765a4f5b37-kube-api-access-mhfh2\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.843094 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-config\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.843116 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-sb\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: 
\"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.843137 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-nb\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.845049 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-logs\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.873063 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-combined-ca-bundle\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.873039 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data-custom\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.875203 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9fc\" (UniqueName: \"kubernetes.io/projected/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-kube-api-access-4p9fc\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.876290 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data\") pod \"barbican-api-6556b45576-ndnkn\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947119 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-fernet-keys\") pod \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947161 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-scripts\") pod \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947197 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-credential-keys\") pod \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947214 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-config-data\") pod \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947278 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-combined-ca-bundle\") pod \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947312 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv2k6\" (UniqueName: \"kubernetes.io/projected/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-kube-api-access-nv2k6\") pod \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\" (UID: \"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401\") " Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947521 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-sb\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947552 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-nb\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947618 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-dns-svc\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947688 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfh2\" (UniqueName: \"kubernetes.io/projected/6c4cde3b-2757-4097-a95e-e9765a4f5b37-kube-api-access-mhfh2\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.947719 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-config\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.948495 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-config\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.951832 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-sb\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: 
\"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.952470 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-dns-svc\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.952699 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-nb\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.965018 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" (UID: "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.967743 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" (UID: "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.967971 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-scripts" (OuterVolumeSpecName: "scripts") pod "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" (UID: "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.976742 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-kube-api-access-nv2k6" (OuterVolumeSpecName: "kube-api-access-nv2k6") pod "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" (UID: "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401"). InnerVolumeSpecName "kube-api-access-nv2k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:01 crc kubenswrapper[4672]: I1206 09:24:01.992528 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfh2\" (UniqueName: \"kubernetes.io/projected/6c4cde3b-2757-4097-a95e-e9765a4f5b37-kube-api-access-mhfh2\") pod \"dnsmasq-dns-cc79f6d89-lj8pk\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.021404 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrzmd" event={"ID":"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401","Type":"ContainerDied","Data":"a70998765929e04f488b26645c5114600898cb79c1fdaa8910186cb37c9c82e4"} Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.021445 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70998765929e04f488b26645c5114600898cb79c1fdaa8910186cb37c9c82e4" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.021513 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hrzmd" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.026948 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" (UID: "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.030791 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-777f8d8c58-75kwt" event={"ID":"46caf0fe-e392-43fb-8893-2a7bd67bd1a7","Type":"ContainerStarted","Data":"10c9d0d7181dfcb440ba754c7444c6072f6192a44c5fb29be6eb702e77e4aa55"} Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.050523 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.050549 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.050558 4672 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.050567 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.050578 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv2k6\" (UniqueName: \"kubernetes.io/projected/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-kube-api-access-nv2k6\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.050739 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-config-data" (OuterVolumeSpecName: "config-data") pod "6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" (UID: 
"6e11e53b-de6c-4e98-8a9c-7fbdac1a1401"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.060993 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.146418 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76bb4c894-tw7m5"] Dec 06 09:24:02 crc kubenswrapper[4672]: E1206 09:24:02.146832 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" containerName="keystone-bootstrap" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.146848 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" containerName="keystone-bootstrap" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.146957 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.147050 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" containerName="keystone-bootstrap" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.147612 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.153946 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.154119 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.157246 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.178323 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76bb4c894-tw7m5"] Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259565 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-credential-keys\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259630 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-config-data\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259664 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-internal-tls-certs\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259692 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-scripts\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259712 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-public-tls-certs\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-combined-ca-bundle\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259785 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrcm\" (UniqueName: \"kubernetes.io/projected/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-kube-api-access-vbrcm\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.259812 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-fernet-keys\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-fernet-keys\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361440 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-credential-keys\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361461 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-config-data\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361490 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-internal-tls-certs\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361513 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-scripts\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361536 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-public-tls-certs\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361575 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-combined-ca-bundle\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.361623 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrcm\" (UniqueName: \"kubernetes.io/projected/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-kube-api-access-vbrcm\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.365656 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-fernet-keys\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.387235 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-scripts\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.387772 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-combined-ca-bundle\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.388153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-public-tls-certs\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.392054 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrcm\" (UniqueName: \"kubernetes.io/projected/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-kube-api-access-vbrcm\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.405887 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-868fbbdb46-nq8wk"] Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.408217 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-credential-keys\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.412529 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-internal-tls-certs\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.415589 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010644c2-5d3a-41e3-a27a-31a6e1d3a0b4-config-data\") pod \"keystone-76bb4c894-tw7m5\" (UID: \"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4\") " pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.474894 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.509328 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d54549b45-whq64"] Dec 06 09:24:02 crc kubenswrapper[4672]: I1206 09:24:02.738431 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc79f6d89-lj8pk"] Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.021976 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6556b45576-ndnkn"] Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.043961 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" event={"ID":"79992d1b-dc0e-43ad-b6cd-942fadb148e6","Type":"ContainerStarted","Data":"23cf328a7df49182ca3fba9a49109865d958d489c7060f0b27c4c99cd2f07f1b"} Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.046488 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" event={"ID":"6c4cde3b-2757-4097-a95e-e9765a4f5b37","Type":"ContainerStarted","Data":"95ec93f795c88c16940a4bdd6c86bcb038aa7f586f64b66b62f1297074d17123"} Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.047504 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d54549b45-whq64" event={"ID":"963924f1-d56b-4422-af4a-cc5c3a17944f","Type":"ContainerStarted","Data":"61c7e2229919b850cf1cfa3b5722288aba4e8fa133a1dcedff89701add459dd4"} Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.049397 4672 generic.go:334] "Generic (PLEG): container finished" podID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerID="cc5c57e0444f2c882380f8fb6782250e02ed4a3736c7b388cf6625adce224300" exitCode=0 Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.049491 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" event={"ID":"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d","Type":"ContainerDied","Data":"cc5c57e0444f2c882380f8fb6782250e02ed4a3736c7b388cf6625adce224300"} Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.050932 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-777f8d8c58-75kwt" event={"ID":"46caf0fe-e392-43fb-8893-2a7bd67bd1a7","Type":"ContainerStarted","Data":"f8ca8ffea9a035e7ab8c7faef26378db752f3c8fca9d095f42fa7c3c676de83b"} Dec 06 09:24:03 crc kubenswrapper[4672]: I1206 09:24:03.185023 4672 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76bb4c894-tw7m5"] Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.430338 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7fcb648c94-6bbbh"] Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.432434 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.441270 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.443497 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.470683 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fcb648c94-6bbbh"] Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.528574 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-config-data\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.528711 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-config-data-custom\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.528756 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28410d08-bb47-4a67-a4d8-c06929b8c644-logs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.528926 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-public-tls-certs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.528987 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4rb\" (UniqueName: \"kubernetes.io/projected/28410d08-bb47-4a67-a4d8-c06929b8c644-kube-api-access-vp4rb\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.529096 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-internal-tls-certs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.529272 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-combined-ca-bundle\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631294 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-internal-tls-certs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631362 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-combined-ca-bundle\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-config-data\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631432 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-config-data-custom\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631458 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28410d08-bb47-4a67-a4d8-c06929b8c644-logs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631508 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-public-tls-certs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.631531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4rb\" (UniqueName: \"kubernetes.io/projected/28410d08-bb47-4a67-a4d8-c06929b8c644-kube-api-access-vp4rb\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.633533 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28410d08-bb47-4a67-a4d8-c06929b8c644-logs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.641253 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-public-tls-certs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.642250 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-config-data\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.642895 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-combined-ca-bundle\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.652109 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-config-data-custom\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.652268 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28410d08-bb47-4a67-a4d8-c06929b8c644-internal-tls-certs\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.659704 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4rb\" (UniqueName: \"kubernetes.io/projected/28410d08-bb47-4a67-a4d8-c06929b8c644-kube-api-access-vp4rb\") pod \"barbican-api-7fcb648c94-6bbbh\" (UID: \"28410d08-bb47-4a67-a4d8-c06929b8c644\") " pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:05 crc kubenswrapper[4672]: I1206 09:24:05.748704 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:08 crc kubenswrapper[4672]: I1206 09:24:08.447867 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: i/o timeout" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.051026 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.154593 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" event={"ID":"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d","Type":"ContainerDied","Data":"8dcea7e50638456f11eb8e42ff0c8ac88f9fcfad9737ae30426b4059bd8dc5be"} Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.154659 4672 scope.go:117] "RemoveContainer" containerID="cc5c57e0444f2c882380f8fb6782250e02ed4a3736c7b388cf6625adce224300" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.154790 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.174502 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76bb4c894-tw7m5" event={"ID":"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4","Type":"ContainerStarted","Data":"574fac9626db0931b4762730aabbf3f0f4f80cc75cb192a363b5737766b625f4"} Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.176755 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6556b45576-ndnkn" event={"ID":"0c3f68cf-fee8-4b57-93b8-daafbe0ee008","Type":"ContainerStarted","Data":"69d8df010756d091b8bee1b6cf1dcc0df862e9ca8830666c94fd4ee667e5ad08"} Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.194442 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhddw\" (UniqueName: \"kubernetes.io/projected/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-kube-api-access-lhddw\") pod \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.194630 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-nb\") pod \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.194725 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-dns-svc\") pod \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.194758 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-sb\") pod \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.194787 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-config\") pod \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\" (UID: \"47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d\") " Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.200131 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-kube-api-access-lhddw" (OuterVolumeSpecName: "kube-api-access-lhddw") pod "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" (UID: "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d"). InnerVolumeSpecName "kube-api-access-lhddw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.290292 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-config" (OuterVolumeSpecName: "config") pod "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" (UID: "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.296977 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.297011 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhddw\" (UniqueName: \"kubernetes.io/projected/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-kube-api-access-lhddw\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.301359 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" (UID: "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.303955 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" (UID: "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.307723 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" (UID: "47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.410501 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.410566 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.410580 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.430814 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fcb648c94-6bbbh"] Dec 06 09:24:09 crc kubenswrapper[4672]: W1206 09:24:09.465900 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28410d08_bb47_4a67_a4d8_c06929b8c644.slice/crio-c56559a69d2cf35c846356986569d2294916d07992aab72d574253b51d04a161 WatchSource:0}: Error finding container c56559a69d2cf35c846356986569d2294916d07992aab72d574253b51d04a161: Status 404 returned error can't find the container with id c56559a69d2cf35c846356986569d2294916d07992aab72d574253b51d04a161 Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.488969 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d47f95b99-h4jdd"] Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.505245 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d47f95b99-h4jdd"] Dec 06 09:24:09 crc kubenswrapper[4672]: E1206 09:24:09.630168 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ecfed3_98e9_4bcf_8f8e_ecfa5049d72d.slice/crio-8dcea7e50638456f11eb8e42ff0c8ac88f9fcfad9737ae30426b4059bd8dc5be\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ecfed3_98e9_4bcf_8f8e_ecfa5049d72d.slice\": RecentStats: unable to find data in memory cache]" Dec 06 09:24:09 crc kubenswrapper[4672]: I1206 09:24:09.857048 4672 scope.go:117] "RemoveContainer" containerID="a79edf8501a816519884d08ea9dd884df0e124b16c391d48966ef8555ee58b96" Dec 06 09:24:10 crc kubenswrapper[4672]: I1206 09:24:10.188648 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcb648c94-6bbbh" event={"ID":"28410d08-bb47-4a67-a4d8-c06929b8c644","Type":"ContainerStarted","Data":"c56559a69d2cf35c846356986569d2294916d07992aab72d574253b51d04a161"} Dec 06 09:24:10 crc kubenswrapper[4672]: I1206 09:24:10.194552 4672 generic.go:334] "Generic (PLEG): container finished" podID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerID="c39fb0504c6bc2758d2c3e37cb76994065f797294f100b34cc565ef50bb97a8b" exitCode=0 Dec 06 09:24:10 crc kubenswrapper[4672]: I1206 09:24:10.194577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" event={"ID":"6c4cde3b-2757-4097-a95e-e9765a4f5b37","Type":"ContainerDied","Data":"c39fb0504c6bc2758d2c3e37cb76994065f797294f100b34cc565ef50bb97a8b"} Dec 06 09:24:10 crc 
kubenswrapper[4672]: I1206 09:24:10.570839 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" path="/var/lib/kubelet/pods/47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d/volumes" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.216477 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcb648c94-6bbbh" event={"ID":"28410d08-bb47-4a67-a4d8-c06929b8c644","Type":"ContainerStarted","Data":"9bcabc70af2cc4e66192f6a7e938424372d7c9c679ffd1a04710c94aad662307"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.216821 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcb648c94-6bbbh" event={"ID":"28410d08-bb47-4a67-a4d8-c06929b8c644","Type":"ContainerStarted","Data":"2b4dcc3841829bd75e03c97f8600f8e175638c763bf22a21a4cc8074f1482893"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.217149 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.217221 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.223077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76bb4c894-tw7m5" event={"ID":"010644c2-5d3a-41e3-a27a-31a6e1d3a0b4","Type":"ContainerStarted","Data":"7f02f315fd64b84094916d1fb56d3b0a3bebea54a05e5b2d969cb692e48e59b7"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.224400 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.228653 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6556b45576-ndnkn" event={"ID":"0c3f68cf-fee8-4b57-93b8-daafbe0ee008","Type":"ContainerStarted","Data":"41fc5e666e82af9c0d19cb6a92cce8bbefc83c276faf1008061c585850a6d93c"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.242997 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerStarted","Data":"d3aaf2d78621f739ce8eb5f80d83ae289c9c5d7193ea6bf07712934d565c2a0f"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.249646 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-777f8d8c58-75kwt" event={"ID":"46caf0fe-e392-43fb-8893-2a7bd67bd1a7","Type":"ContainerStarted","Data":"8de7f4891a90e48e43a5f1d47ce4e892c93f32923854e1ef745bc15eb0242064"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.252284 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" event={"ID":"79992d1b-dc0e-43ad-b6cd-942fadb148e6","Type":"ContainerStarted","Data":"666a58f9fb374ff8460f62240332f803dc961cb5710dc2ddf152ff0bbdcee7f3"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.257056 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" event={"ID":"6c4cde3b-2757-4097-a95e-e9765a4f5b37","Type":"ContainerStarted","Data":"20852569fb5d359adade53d3a3b28061d4b05c615ef4782f0e6616a32ef3b224"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.257338 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.259070 4672 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-api-7fcb648c94-6bbbh" podStartSLOduration=6.259047103 podStartE2EDuration="6.259047103s" podCreationTimestamp="2025-12-06 09:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:11.236093042 +0000 UTC m=+1068.980353369" watchObservedRunningTime="2025-12-06 09:24:11.259047103 +0000 UTC m=+1069.003307390" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.259480 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d54549b45-whq64" event={"ID":"963924f1-d56b-4422-af4a-cc5c3a17944f","Type":"ContainerStarted","Data":"159498ffcce99a0b57e68797b322bc47bd67c1d737ffd1c1ae0150fcdde8a6f8"} Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.268117 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76bb4c894-tw7m5" podStartSLOduration=9.268100127 podStartE2EDuration="9.268100127s" podCreationTimestamp="2025-12-06 09:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:11.256900024 +0000 UTC m=+1069.001160311" watchObservedRunningTime="2025-12-06 09:24:11.268100127 +0000 UTC m=+1069.012360414" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.279283 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-777f8d8c58-75kwt" podStartSLOduration=12.279265649 podStartE2EDuration="12.279265649s" podCreationTimestamp="2025-12-06 09:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:11.277908803 +0000 UTC m=+1069.022169090" watchObservedRunningTime="2025-12-06 09:24:11.279265649 +0000 UTC m=+1069.023525926" Dec 06 09:24:11 crc kubenswrapper[4672]: I1206 09:24:11.297675 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" podStartSLOduration=10.297591135 podStartE2EDuration="10.297591135s" podCreationTimestamp="2025-12-06 09:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:11.290885853 +0000 UTC m=+1069.035146140" watchObservedRunningTime="2025-12-06 09:24:11.297591135 +0000 UTC m=+1069.041851422" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.270269 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d54549b45-whq64" event={"ID":"963924f1-d56b-4422-af4a-cc5c3a17944f","Type":"ContainerStarted","Data":"e2b27fab7613d96594df5515e7bd872095e6c1b7a4ee92fffc9124b0214f74be"} Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.273675 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6556b45576-ndnkn" event={"ID":"0c3f68cf-fee8-4b57-93b8-daafbe0ee008","Type":"ContainerStarted","Data":"100216a82d6fdb21fac77d981094ced2e082e2e89e4be455b93181c796dce581"} Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.274475 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.274500 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.276056 4672 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnkpx" event={"ID":"34617816-681d-44a7-b88d-73983735dd75","Type":"ContainerStarted","Data":"8a5bc2686ab392e7026d7311a2b21f7285dff688effd999c36f0d14f7b03646b"} Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.278921 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" event={"ID":"79992d1b-dc0e-43ad-b6cd-942fadb148e6","Type":"ContainerStarted","Data":"fdea6d04eb4f6f2c78b406d378979e01d23396e7d546e8075d61f566718313d9"} Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.279775 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.280203 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.288564 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d54549b45-whq64" podStartSLOduration=3.949114304 podStartE2EDuration="11.288547085s" podCreationTimestamp="2025-12-06 09:24:01 +0000 UTC" firstStartedPulling="2025-12-06 09:24:02.56169987 +0000 UTC m=+1060.305960157" lastFinishedPulling="2025-12-06 09:24:09.901132651 +0000 UTC m=+1067.645392938" observedRunningTime="2025-12-06 09:24:12.287958989 +0000 UTC m=+1070.032219266" watchObservedRunningTime="2025-12-06 09:24:12.288547085 +0000 UTC m=+1070.032807372" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.308376 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-868fbbdb46-nq8wk" podStartSLOduration=3.800264439 podStartE2EDuration="11.3083593s" podCreationTimestamp="2025-12-06 09:24:01 +0000 UTC" firstStartedPulling="2025-12-06 09:24:02.452173979 +0000 UTC m=+1060.196434256" lastFinishedPulling="2025-12-06 09:24:09.96026883 +0000 UTC m=+1067.704529117" observedRunningTime="2025-12-06 09:24:12.306257544 +0000 UTC m=+1070.050517821" watchObservedRunningTime="2025-12-06 09:24:12.3083593 +0000 UTC m=+1070.052619587" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.357772 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6556b45576-ndnkn" podStartSLOduration=11.357755406 podStartE2EDuration="11.357755406s" podCreationTimestamp="2025-12-06 09:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:12.345790533 +0000 UTC m=+1070.090050820" watchObservedRunningTime="2025-12-06 09:24:12.357755406 +0000 UTC m=+1070.102015693" Dec 06 09:24:12 crc kubenswrapper[4672]: I1206 09:24:12.357873 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jnkpx" podStartSLOduration=4.635738984 podStartE2EDuration="43.357867769s" podCreationTimestamp="2025-12-06 09:23:29 +0000 UTC" firstStartedPulling="2025-12-06 09:23:31.152448809 +0000 UTC m=+1028.896709096" lastFinishedPulling="2025-12-06 09:24:09.874577594 +0000 UTC m=+1067.618837881" observedRunningTime="2025-12-06 09:24:12.327527449 +0000 UTC m=+1070.071787736" watchObservedRunningTime="2025-12-06 09:24:12.357867769 +0000 UTC m=+1070.102128056" Dec 06 09:24:13 crc kubenswrapper[4672]: I1206 09:24:13.449316 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d47f95b99-h4jdd" 
podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: i/o timeout" Dec 06 09:24:14 crc kubenswrapper[4672]: I1206 09:24:14.130446 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:16 crc kubenswrapper[4672]: I1206 09:24:16.332091 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-777f8d8c58-75kwt" Dec 06 09:24:17 crc kubenswrapper[4672]: I1206 09:24:17.150950 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:17 crc kubenswrapper[4672]: I1206 09:24:17.246496 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76445f8cf5-d9mzn"] Dec 06 09:24:17 crc kubenswrapper[4672]: I1206 09:24:17.252148 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="dnsmasq-dns" containerID="cri-o://7ae30a5f2eb33c96762e4adef64716062087190ba3f4512d00813545c751f0b7" gracePeriod=10 Dec 06 09:24:17 crc kubenswrapper[4672]: I1206 09:24:17.334136 4672 generic.go:334] "Generic (PLEG): container finished" podID="34617816-681d-44a7-b88d-73983735dd75" containerID="8a5bc2686ab392e7026d7311a2b21f7285dff688effd999c36f0d14f7b03646b" exitCode=0 Dec 06 09:24:17 crc kubenswrapper[4672]: I1206 09:24:17.334182 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnkpx" event={"ID":"34617816-681d-44a7-b88d-73983735dd75","Type":"ContainerDied","Data":"8a5bc2686ab392e7026d7311a2b21f7285dff688effd999c36f0d14f7b03646b"} Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.342588 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc4be349-0e33-467d-9214-134e91da88f2" containerID="7ae30a5f2eb33c96762e4adef64716062087190ba3f4512d00813545c751f0b7" exitCode=0 Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.342644 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" event={"ID":"dc4be349-0e33-467d-9214-134e91da88f2","Type":"ContainerDied","Data":"7ae30a5f2eb33c96762e4adef64716062087190ba3f4512d00813545c751f0b7"} Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.546872 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.685732 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fcb648c94-6bbbh" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.758841 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6556b45576-ndnkn"] Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.759102 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" containerID="cri-o://41fc5e666e82af9c0d19cb6a92cce8bbefc83c276faf1008061c585850a6d93c" gracePeriod=30 Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.759576 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" 
containerID="cri-o://100216a82d6fdb21fac77d981094ced2e082e2e89e4be455b93181c796dce581" gracePeriod=30 Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.771074 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": EOF" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.771364 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": EOF" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.771764 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": EOF" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.771997 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": EOF" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.773754 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": EOF" Dec 06 09:24:18 crc kubenswrapper[4672]: I1206 09:24:18.773743 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:51300->10.217.0.145:9311: read: connection reset by peer" Dec 06 09:24:19 crc kubenswrapper[4672]: I1206 09:24:19.355965 4672 generic.go:334] "Generic (PLEG): container finished" podID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerID="41fc5e666e82af9c0d19cb6a92cce8bbefc83c276faf1008061c585850a6d93c" exitCode=143 Dec 06 09:24:19 crc kubenswrapper[4672]: I1206 09:24:19.356147 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6556b45576-ndnkn" event={"ID":"0c3f68cf-fee8-4b57-93b8-daafbe0ee008","Type":"ContainerDied","Data":"41fc5e666e82af9c0d19cb6a92cce8bbefc83c276faf1008061c585850a6d93c"} Dec 06 09:24:20 crc kubenswrapper[4672]: I1206 09:24:20.899774 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jnkpx" Dec 06 09:24:20 crc kubenswrapper[4672]: I1206 09:24:20.907993 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.062802 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-scripts\") pod \"34617816-681d-44a7-b88d-73983735dd75\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063050 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-combined-ca-bundle\") pod \"34617816-681d-44a7-b88d-73983735dd75\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063084 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34617816-681d-44a7-b88d-73983735dd75-etc-machine-id\") pod \"34617816-681d-44a7-b88d-73983735dd75\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063154 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9kdt\" (UniqueName: \"kubernetes.io/projected/34617816-681d-44a7-b88d-73983735dd75-kube-api-access-d9kdt\") pod \"34617816-681d-44a7-b88d-73983735dd75\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063177 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-nb\") pod \"dc4be349-0e33-467d-9214-134e91da88f2\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063201 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-config\") pod \"dc4be349-0e33-467d-9214-134e91da88f2\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063249 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-sb\") pod \"dc4be349-0e33-467d-9214-134e91da88f2\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063267 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28m7\" (UniqueName: \"kubernetes.io/projected/dc4be349-0e33-467d-9214-134e91da88f2-kube-api-access-f28m7\") pod \"dc4be349-0e33-467d-9214-134e91da88f2\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063307 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-db-sync-config-data\") pod \"34617816-681d-44a7-b88d-73983735dd75\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063366 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-config-data\") pod 
\"34617816-681d-44a7-b88d-73983735dd75\" (UID: \"34617816-681d-44a7-b88d-73983735dd75\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.063384 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-dns-svc\") pod \"dc4be349-0e33-467d-9214-134e91da88f2\" (UID: \"dc4be349-0e33-467d-9214-134e91da88f2\") " Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.067406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34617816-681d-44a7-b88d-73983735dd75-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "34617816-681d-44a7-b88d-73983735dd75" (UID: "34617816-681d-44a7-b88d-73983735dd75"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.070943 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34617816-681d-44a7-b88d-73983735dd75-kube-api-access-d9kdt" (OuterVolumeSpecName: "kube-api-access-d9kdt") pod "34617816-681d-44a7-b88d-73983735dd75" (UID: "34617816-681d-44a7-b88d-73983735dd75"). InnerVolumeSpecName "kube-api-access-d9kdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.082720 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-scripts" (OuterVolumeSpecName: "scripts") pod "34617816-681d-44a7-b88d-73983735dd75" (UID: "34617816-681d-44a7-b88d-73983735dd75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.084771 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4be349-0e33-467d-9214-134e91da88f2-kube-api-access-f28m7" (OuterVolumeSpecName: "kube-api-access-f28m7") pod "dc4be349-0e33-467d-9214-134e91da88f2" (UID: "dc4be349-0e33-467d-9214-134e91da88f2"). InnerVolumeSpecName "kube-api-access-f28m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.084993 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "34617816-681d-44a7-b88d-73983735dd75" (UID: "34617816-681d-44a7-b88d-73983735dd75"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.118347 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34617816-681d-44a7-b88d-73983735dd75" (UID: "34617816-681d-44a7-b88d-73983735dd75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.146501 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc4be349-0e33-467d-9214-134e91da88f2" (UID: "dc4be349-0e33-467d-9214-134e91da88f2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.148153 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-config" (OuterVolumeSpecName: "config") pod "dc4be349-0e33-467d-9214-134e91da88f2" (UID: "dc4be349-0e33-467d-9214-134e91da88f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.156875 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc4be349-0e33-467d-9214-134e91da88f2" (UID: "dc4be349-0e33-467d-9214-134e91da88f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.164394 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc4be349-0e33-467d-9214-134e91da88f2" (UID: "dc4be349-0e33-467d-9214-134e91da88f2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.164854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-config-data" (OuterVolumeSpecName: "config-data") pod "34617816-681d-44a7-b88d-73983735dd75" (UID: "34617816-681d-44a7-b88d-73983735dd75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165052 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165080 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28m7\" (UniqueName: \"kubernetes.io/projected/dc4be349-0e33-467d-9214-134e91da88f2-kube-api-access-f28m7\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165091 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165100 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165109 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165119 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165130 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34617816-681d-44a7-b88d-73983735dd75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165138 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34617816-681d-44a7-b88d-73983735dd75-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165148 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9kdt\" (UniqueName: \"kubernetes.io/projected/34617816-681d-44a7-b88d-73983735dd75-kube-api-access-d9kdt\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165157 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.165166 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc4be349-0e33-467d-9214-134e91da88f2-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.376017 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jnkpx" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.376029 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnkpx" event={"ID":"34617816-681d-44a7-b88d-73983735dd75","Type":"ContainerDied","Data":"704766205aca81a7409d17ad0a051289746b3fc747594baab804916133fce7ed"} Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.376423 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704766205aca81a7409d17ad0a051289746b3fc747594baab804916133fce7ed" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.378856 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerStarted","Data":"332a1f591ccc359ba9670a5346322a6169aac3076382f9083e9732cc32935cc7"} Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.379004 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.378997 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-central-agent" containerID="cri-o://f5f794408b6d4724c62aae1e5ee1e4ac7383ab3451759f6b8e6aa0e0ab98ad7c" gracePeriod=30 Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.379036 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="sg-core" containerID="cri-o://d3aaf2d78621f739ce8eb5f80d83ae289c9c5d7193ea6bf07712934d565c2a0f" gracePeriod=30 Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.379088 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-notification-agent" containerID="cri-o://30c8db56ebd0459d81667b1dc66d3ee86fe5dbb7addfd4bca363f1c34ab4de5a" gracePeriod=30 Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.379181 4672 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="proxy-httpd" containerID="cri-o://332a1f591ccc359ba9670a5346322a6169aac3076382f9083e9732cc32935cc7" gracePeriod=30 Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.384875 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" event={"ID":"dc4be349-0e33-467d-9214-134e91da88f2","Type":"ContainerDied","Data":"04dcb28b07415016bda166d74989e7e067bce99dcec8daf5aaa1f49ae4bf9824"} Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.384925 4672 scope.go:117] "RemoveContainer" containerID="7ae30a5f2eb33c96762e4adef64716062087190ba3f4512d00813545c751f0b7" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.385061 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.410984 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.678798843 podStartE2EDuration="52.410961209s" podCreationTimestamp="2025-12-06 09:23:29 +0000 UTC" firstStartedPulling="2025-12-06 09:23:31.178905748 +0000 UTC m=+1028.923166035" lastFinishedPulling="2025-12-06 09:24:20.911068114 +0000 UTC m=+1078.655328401" observedRunningTime="2025-12-06 09:24:21.407522927 +0000 UTC m=+1079.151783214" watchObservedRunningTime="2025-12-06 09:24:21.410961209 +0000 UTC m=+1079.155221506" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.418901 4672 scope.go:117] "RemoveContainer" containerID="c0f656ea5633a2c28d6df83d0b8b981a77b4ce557e0e91c5858fb26740879fb3" Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.439857 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76445f8cf5-d9mzn"] Dec 06 09:24:21 crc kubenswrapper[4672]: I1206 09:24:21.458538 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76445f8cf5-d9mzn"] Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.176639 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:22 crc kubenswrapper[4672]: E1206 09:24:22.177004 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="dnsmasq-dns" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177019 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="dnsmasq-dns" Dec 06 09:24:22 crc kubenswrapper[4672]: E1206 09:24:22.177034 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="init" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177041 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="init" Dec 06 09:24:22 crc kubenswrapper[4672]: E1206 09:24:22.177053 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="init" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177060 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="init" Dec 06 09:24:22 crc kubenswrapper[4672]: E1206 09:24:22.177077 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="dnsmasq-dns" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 
Dec 06 09:24:22 crc kubenswrapper[4672]: E1206 09:24:22.177095 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34617816-681d-44a7-b88d-73983735dd75" containerName="cinder-db-sync"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177100 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="34617816-681d-44a7-b88d-73983735dd75" containerName="cinder-db-sync"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177257 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ecfed3-98e9-4bcf-8f8e-ecfa5049d72d" containerName="dnsmasq-dns"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177273 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="dnsmasq-dns"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.177282 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="34617816-681d-44a7-b88d-73983735dd75" containerName="cinder-db-sync"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.178241 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.189165 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.189368 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lhcs8"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.201180 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.201454 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.211245 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.251747 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d6875bb67-xw22g"]
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.268005 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g"
Need to start a new one" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.291479 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.291768 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.291861 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f3b4d4-d557-4187-b590-4601d0a3fb9b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.291939 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6mxr\" (UniqueName: \"kubernetes.io/projected/28f3b4d4-d557-4187-b590-4601d0a3fb9b-kube-api-access-v6mxr\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.292034 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.292145 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-scripts\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.300516 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d6875bb67-xw22g"] Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.396516 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-scripts\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.397652 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.397768 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.397858 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.397929 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f3b4d4-d557-4187-b590-4601d0a3fb9b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.398009 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6mxr\" (UniqueName: \"kubernetes.io/projected/28f3b4d4-d557-4187-b590-4601d0a3fb9b-kube-api-access-v6mxr\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.398101 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-config\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.398188 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.398279 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.398357 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-dns-svc\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.398449 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjpq\" (UniqueName: \"kubernetes.io/projected/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-kube-api-access-rjjpq\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.403200 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.404054 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.404487 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f3b4d4-d557-4187-b590-4601d0a3fb9b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.409278 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.414024 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.431189 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6mxr\" (UniqueName: \"kubernetes.io/projected/28f3b4d4-d557-4187-b590-4601d0a3fb9b-kube-api-access-v6mxr\") pod \"cinder-scheduler-0\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.435796 4672 generic.go:334] "Generic (PLEG): container finished" podID="c2437522-da6e-48b0-94b4-30b0968bccde" containerID="332a1f591ccc359ba9670a5346322a6169aac3076382f9083e9732cc32935cc7" exitCode=0 Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.435834 4672 generic.go:334] "Generic (PLEG): container finished" podID="c2437522-da6e-48b0-94b4-30b0968bccde" containerID="d3aaf2d78621f739ce8eb5f80d83ae289c9c5d7193ea6bf07712934d565c2a0f" exitCode=2 Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.435843 4672 generic.go:334] "Generic (PLEG): container finished" podID="c2437522-da6e-48b0-94b4-30b0968bccde" containerID="f5f794408b6d4724c62aae1e5ee1e4ac7383ab3451759f6b8e6aa0e0ab98ad7c" exitCode=0 Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.435905 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerDied","Data":"332a1f591ccc359ba9670a5346322a6169aac3076382f9083e9732cc32935cc7"} Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.435934 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerDied","Data":"d3aaf2d78621f739ce8eb5f80d83ae289c9c5d7193ea6bf07712934d565c2a0f"} Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.435965 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerDied","Data":"f5f794408b6d4724c62aae1e5ee1e4ac7383ab3451759f6b8e6aa0e0ab98ad7c"} Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.496501 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.498034 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.499581 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.500223 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-config\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.500280 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.500301 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-dns-svc\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.500318 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjjpq\" (UniqueName: \"kubernetes.io/projected/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-kube-api-access-rjjpq\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.501242 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-config\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.501644 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-dns-svc\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.503991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.504440 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.505244 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.517968 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lhcs8" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.525417 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.527728 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.529174 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjpq\" (UniqueName: \"kubernetes.io/projected/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-kube-api-access-rjjpq\") pod \"dnsmasq-dns-5d6875bb67-xw22g\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.601500 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4be349-0e33-467d-9214-134e91da88f2" path="/var/lib/kubelet/pods/dc4be349-0e33-467d-9214-134e91da88f2/volumes" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.608726 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.610397 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.610648 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44d1e242-93af-4397-9030-232f8780c592-etc-machine-id\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.610805 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.611012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-scripts\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.611117 
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.611266 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5s7n\" (UniqueName: \"kubernetes.io/projected/44d1e242-93af-4397-9030-232f8780c592-kube-api-access-j5s7n\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.626962 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.714733 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.715055 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44d1e242-93af-4397-9030-232f8780c592-etc-machine-id\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.715135 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.715156 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-scripts\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.715183 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data-custom\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.715237 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5s7n\" (UniqueName: \"kubernetes.io/projected/44d1e242-93af-4397-9030-232f8780c592-kube-api-access-j5s7n\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.715290 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.716583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0"
\"kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.716805 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44d1e242-93af-4397-9030-232f8780c592-etc-machine-id\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.722119 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-scripts\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.725505 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data-custom\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.728111 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.728559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.747987 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5s7n\" (UniqueName: \"kubernetes.io/projected/44d1e242-93af-4397-9030-232f8780c592-kube-api-access-j5s7n\") pod \"cinder-api-0\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " pod="openstack/cinder-api-0" Dec 06 09:24:22 crc kubenswrapper[4672]: I1206 09:24:22.907917 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.059115 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d6875bb67-xw22g"] Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.092729 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.474320 4672 generic.go:334] "Generic (PLEG): container finished" podID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerID="c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c" exitCode=0 Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.474615 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" event={"ID":"1601ecfd-770a-4d88-9a5e-fb465a0b98f0","Type":"ContainerDied","Data":"c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c"} Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.474640 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" event={"ID":"1601ecfd-770a-4d88-9a5e-fb465a0b98f0","Type":"ContainerStarted","Data":"705dd69ff52892f097c13b0c504c7592fb8487f4b2bee996854e6481248050cc"} Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.476213 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28f3b4d4-d557-4187-b590-4601d0a3fb9b","Type":"ContainerStarted","Data":"b31f624a4a2d8977de7d00fc005bbaaa49884d60390ff1d35e1446f7366c8041"} Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.485731 4672 generic.go:334] "Generic (PLEG): container finished" podID="c2437522-da6e-48b0-94b4-30b0968bccde" containerID="30c8db56ebd0459d81667b1dc66d3ee86fe5dbb7addfd4bca363f1c34ab4de5a" exitCode=0 Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.485774 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerDied","Data":"30c8db56ebd0459d81667b1dc66d3ee86fe5dbb7addfd4bca363f1c34ab4de5a"} Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.508585 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.545922 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.640085 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.733885 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-log-httpd\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.734440 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8km79\" (UniqueName: \"kubernetes.io/projected/c2437522-da6e-48b0-94b4-30b0968bccde-kube-api-access-8km79\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.734538 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-config-data\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.734651 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-combined-ca-bundle\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.734749 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-scripts\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.734873 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-sg-core-conf-yaml\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.735006 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-run-httpd\") pod \"c2437522-da6e-48b0-94b4-30b0968bccde\" (UID: \"c2437522-da6e-48b0-94b4-30b0968bccde\") " Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.736275 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.737381 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.742532 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-scripts" (OuterVolumeSpecName: "scripts") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.778799 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2437522-da6e-48b0-94b4-30b0968bccde-kube-api-access-8km79" (OuterVolumeSpecName: "kube-api-access-8km79") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "kube-api-access-8km79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.788922 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.836651 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.836681 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8km79\" (UniqueName: \"kubernetes.io/projected/c2437522-da6e-48b0-94b4-30b0968bccde-kube-api-access-8km79\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.836691 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.836699 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.836708 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2437522-da6e-48b0-94b4-30b0968bccde-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.882067 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.902957 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-config-data" (OuterVolumeSpecName: "config-data") pod "c2437522-da6e-48b0-94b4-30b0968bccde" (UID: "c2437522-da6e-48b0-94b4-30b0968bccde"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.938222 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:23 crc kubenswrapper[4672]: I1206 09:24:23.938254 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2437522-da6e-48b0-94b4-30b0968bccde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.195181 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:51314->10.217.0.145:9311: read: connection reset by peer" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.195181 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6556b45576-ndnkn" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.145:9311/healthcheck\": read tcp 10.217.0.2:51312->10.217.0.145:9311: read: connection reset by peer" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.508872 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.563913 4672 generic.go:334] "Generic (PLEG): container finished" podID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerID="100216a82d6fdb21fac77d981094ced2e082e2e89e4be455b93181c796dce581" exitCode=0 Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.578364 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6556b45576-ndnkn" event={"ID":"0c3f68cf-fee8-4b57-93b8-daafbe0ee008","Type":"ContainerDied","Data":"100216a82d6fdb21fac77d981094ced2e082e2e89e4be455b93181c796dce581"} Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.578408 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"44d1e242-93af-4397-9030-232f8780c592","Type":"ContainerStarted","Data":"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d"} Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.578417 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"44d1e242-93af-4397-9030-232f8780c592","Type":"ContainerStarted","Data":"0372f26ba6a7d163caf1af3aca182e2c5f8f162679c1fb6752c69d76eb58ebc9"} Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.586901 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2437522-da6e-48b0-94b4-30b0968bccde","Type":"ContainerDied","Data":"cf1a2d80d0cecb2f41beb1f6efbe7a39416dd9e46c15c8b9183c30eceeb413b5"} Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.586951 4672 scope.go:117] "RemoveContainer" containerID="332a1f591ccc359ba9670a5346322a6169aac3076382f9083e9732cc32935cc7" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.587042 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.593390 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" event={"ID":"1601ecfd-770a-4d88-9a5e-fb465a0b98f0","Type":"ContainerStarted","Data":"a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb"} Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.593838 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.636349 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" podStartSLOduration=2.636329627 podStartE2EDuration="2.636329627s" podCreationTimestamp="2025-12-06 09:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:24.628405882 +0000 UTC m=+1082.372666169" watchObservedRunningTime="2025-12-06 09:24:24.636329627 +0000 UTC m=+1082.380589914" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.655722 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.668664 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.672388 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.681996 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:24 crc kubenswrapper[4672]: E1206 09:24:24.683351 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="sg-core" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.683377 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="sg-core" Dec 06 09:24:24 crc kubenswrapper[4672]: E1206 09:24:24.683396 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-notification-agent" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.683404 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-notification-agent" Dec 06 09:24:24 crc kubenswrapper[4672]: E1206 09:24:24.683422 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-central-agent" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.683431 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-central-agent" Dec 06 09:24:24 crc kubenswrapper[4672]: E1206 09:24:24.683464 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.683472 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" Dec 06 09:24:24 crc kubenswrapper[4672]: E1206 09:24:24.683491 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" 
containerName="barbican-api-log" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.683498 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" Dec 06 09:24:24 crc kubenswrapper[4672]: E1206 09:24:24.683511 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="proxy-httpd" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.683518 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="proxy-httpd" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.685871 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-central-agent" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.685924 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="ceilometer-notification-agent" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.685946 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="proxy-httpd" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.685970 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.686001 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" containerName="sg-core" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.686033 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" containerName="barbican-api-log" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.688734 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.690831 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.691059 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.707197 4672 scope.go:117] "RemoveContainer" containerID="d3aaf2d78621f739ce8eb5f80d83ae289c9c5d7193ea6bf07712934d565c2a0f" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.707348 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766431 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data-custom\") pod \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766483 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p9fc\" (UniqueName: \"kubernetes.io/projected/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-kube-api-access-4p9fc\") pod \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766522 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data\") pod \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766649 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z79fh\" (UniqueName: \"kubernetes.io/projected/4908de30-b638-44a7-b414-d1ac88946fb1-kube-api-access-z79fh\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766678 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-scripts\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766698 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766715 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-config-data\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766731 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-log-httpd\") pod 
\"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766770 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.766902 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.784939 4672 scope.go:117] "RemoveContainer" containerID="30c8db56ebd0459d81667b1dc66d3ee86fe5dbb7addfd4bca363f1c34ab4de5a" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.793288 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0c3f68cf-fee8-4b57-93b8-daafbe0ee008" (UID: "0c3f68cf-fee8-4b57-93b8-daafbe0ee008"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.793386 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76445f8cf5-d9mzn" podUID="dc4be349-0e33-467d-9214-134e91da88f2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: i/o timeout" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.797748 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-kube-api-access-4p9fc" (OuterVolumeSpecName: "kube-api-access-4p9fc") pod "0c3f68cf-fee8-4b57-93b8-daafbe0ee008" (UID: "0c3f68cf-fee8-4b57-93b8-daafbe0ee008"). InnerVolumeSpecName "kube-api-access-4p9fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869360 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-logs\") pod \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869448 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-combined-ca-bundle\") pod \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\" (UID: \"0c3f68cf-fee8-4b57-93b8-daafbe0ee008\") " Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869748 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-scripts\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869778 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869797 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-config-data\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869813 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869853 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869922 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.869962 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z79fh\" (UniqueName: \"kubernetes.io/projected/4908de30-b638-44a7-b414-d1ac88946fb1-kube-api-access-z79fh\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.870007 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 
09:24:24.870017 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p9fc\" (UniqueName: \"kubernetes.io/projected/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-kube-api-access-4p9fc\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.872918 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-logs" (OuterVolumeSpecName: "logs") pod "0c3f68cf-fee8-4b57-93b8-daafbe0ee008" (UID: "0c3f68cf-fee8-4b57-93b8-daafbe0ee008"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.876448 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.876678 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.879965 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data" (OuterVolumeSpecName: "config-data") pod "0c3f68cf-fee8-4b57-93b8-daafbe0ee008" (UID: "0c3f68cf-fee8-4b57-93b8-daafbe0ee008"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.884249 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-config-data\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.907815 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-scripts\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.908235 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.909196 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.937533 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z79fh\" (UniqueName: \"kubernetes.io/projected/4908de30-b638-44a7-b414-d1ac88946fb1-kube-api-access-z79fh\") pod \"ceilometer-0\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " pod="openstack/ceilometer-0" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 
09:24:24.953723 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c3f68cf-fee8-4b57-93b8-daafbe0ee008" (UID: "0c3f68cf-fee8-4b57-93b8-daafbe0ee008"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.972954 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.972996 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:24 crc kubenswrapper[4672]: I1206 09:24:24.973005 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3f68cf-fee8-4b57-93b8-daafbe0ee008-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.034752 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.054199 4672 scope.go:117] "RemoveContainer" containerID="f5f794408b6d4724c62aae1e5ee1e4ac7383ab3451759f6b8e6aa0e0ab98ad7c" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.589682 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.659273 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28f3b4d4-d557-4187-b590-4601d0a3fb9b","Type":"ContainerStarted","Data":"c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5"} Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.677621 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6556b45576-ndnkn" event={"ID":"0c3f68cf-fee8-4b57-93b8-daafbe0ee008","Type":"ContainerDied","Data":"69d8df010756d091b8bee1b6cf1dcc0df862e9ca8830666c94fd4ee667e5ad08"} Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.677677 4672 scope.go:117] "RemoveContainer" containerID="100216a82d6fdb21fac77d981094ced2e082e2e89e4be455b93181c796dce581" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.677870 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6556b45576-ndnkn" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.696476 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"44d1e242-93af-4397-9030-232f8780c592","Type":"ContainerStarted","Data":"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f"} Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.696767 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.696859 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api" containerID="cri-o://85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f" gracePeriod=30 Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.696864 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api-log" containerID="cri-o://286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d" gracePeriod=30 Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.720521 4672 scope.go:117] "RemoveContainer" containerID="41fc5e666e82af9c0d19cb6a92cce8bbefc83c276faf1008061c585850a6d93c" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.724463 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.724441814 podStartE2EDuration="3.724441814s" podCreationTimestamp="2025-12-06 09:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:25.719803389 +0000 UTC m=+1083.464063676" watchObservedRunningTime="2025-12-06 09:24:25.724441814 +0000 UTC m=+1083.468702101" Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.748238 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6556b45576-ndnkn"] Dec 06 09:24:25 crc kubenswrapper[4672]: I1206 09:24:25.761173 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6556b45576-ndnkn"] Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.184555 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d999c477-wf9vn" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.244257 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85f8bd445b-dxm5t"] Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.250801 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85f8bd445b-dxm5t" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-api" containerID="cri-o://fcfbb3a8fa34c8ded377c8feb108edb04a6c456309a3c94277e9b7012da91f57" gracePeriod=30 Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.250886 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85f8bd445b-dxm5t" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-httpd" containerID="cri-o://2f424b0498a0ac11b6c44b1a0c740dfcb7daa1e794b1450978a3bb7757bf7b89" gracePeriod=30 Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.568003 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3f68cf-fee8-4b57-93b8-daafbe0ee008" 
path="/var/lib/kubelet/pods/0c3f68cf-fee8-4b57-93b8-daafbe0ee008/volumes" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.568858 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2437522-da6e-48b0-94b4-30b0968bccde" path="/var/lib/kubelet/pods/c2437522-da6e-48b0-94b4-30b0968bccde/volumes" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.672905 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.727642 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28f3b4d4-d557-4187-b590-4601d0a3fb9b","Type":"ContainerStarted","Data":"78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.736525 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerStarted","Data":"d2ac910eed5c8566d194e428bb1bf075dffa95db1329ba68cacb7ab41fef24fc"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.736570 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerStarted","Data":"8ffc6c533b3e606bbeec645322302b554124d47450969ae47366a7f60d2c6408"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.740307 4672 generic.go:334] "Generic (PLEG): container finished" podID="b2eaecb7-d959-4452-ba91-029191056f70" containerID="2f424b0498a0ac11b6c44b1a0c740dfcb7daa1e794b1450978a3bb7757bf7b89" exitCode=0 Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.740377 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f8bd445b-dxm5t" event={"ID":"b2eaecb7-d959-4452-ba91-029191056f70","Type":"ContainerDied","Data":"2f424b0498a0ac11b6c44b1a0c740dfcb7daa1e794b1450978a3bb7757bf7b89"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.749247 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.828459196 podStartE2EDuration="4.749230339s" podCreationTimestamp="2025-12-06 09:24:22 +0000 UTC" firstStartedPulling="2025-12-06 09:24:23.118120402 +0000 UTC m=+1080.862380689" lastFinishedPulling="2025-12-06 09:24:24.038891535 +0000 UTC m=+1081.783151832" observedRunningTime="2025-12-06 09:24:26.744741838 +0000 UTC m=+1084.489002125" watchObservedRunningTime="2025-12-06 09:24:26.749230339 +0000 UTC m=+1084.493490616" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754656 4672 generic.go:334] "Generic (PLEG): container finished" podID="44d1e242-93af-4397-9030-232f8780c592" containerID="85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f" exitCode=0 Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754695 4672 generic.go:334] "Generic (PLEG): container finished" podID="44d1e242-93af-4397-9030-232f8780c592" containerID="286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d" exitCode=143 Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754719 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"44d1e242-93af-4397-9030-232f8780c592","Type":"ContainerDied","Data":"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754744 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"44d1e242-93af-4397-9030-232f8780c592","Type":"ContainerDied","Data":"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754754 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"44d1e242-93af-4397-9030-232f8780c592","Type":"ContainerDied","Data":"0372f26ba6a7d163caf1af3aca182e2c5f8f162679c1fb6752c69d76eb58ebc9"} Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754771 4672 scope.go:117] "RemoveContainer" containerID="85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.754841 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.777357 4672 scope.go:117] "RemoveContainer" containerID="286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.800079 4672 scope.go:117] "RemoveContainer" containerID="85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f" Dec 06 09:24:26 crc kubenswrapper[4672]: E1206 09:24:26.800707 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f\": container with ID starting with 85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f not found: ID does not exist" containerID="85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.800736 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f"} err="failed to get container status \"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f\": rpc error: code = NotFound desc = could not find container \"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f\": container with ID starting with 85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f not found: ID does not exist" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.800759 4672 scope.go:117] "RemoveContainer" containerID="286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d" Dec 06 09:24:26 crc kubenswrapper[4672]: E1206 09:24:26.801284 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d\": container with ID starting with 286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d not found: ID does not exist" containerID="286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.801308 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d"} err="failed to get container status \"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d\": rpc error: code = NotFound desc = could not find container \"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d\": container with ID starting with 286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d not found: ID does not exist" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.801323 4672 scope.go:117] "RemoveContainer" 
containerID="85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.801582 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5s7n\" (UniqueName: \"kubernetes.io/projected/44d1e242-93af-4397-9030-232f8780c592-kube-api-access-j5s7n\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.801953 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data-custom\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.802132 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.802223 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.802436 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-combined-ca-bundle\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.802486 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-scripts\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.802524 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44d1e242-93af-4397-9030-232f8780c592-etc-machine-id\") pod \"44d1e242-93af-4397-9030-232f8780c592\" (UID: \"44d1e242-93af-4397-9030-232f8780c592\") " Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.802985 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs" (OuterVolumeSpecName: "logs") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.801866 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f"} err="failed to get container status \"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f\": rpc error: code = NotFound desc = could not find container \"85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f\": container with ID starting with 85ececd1cc8c15c1612c05efefb7fe0b207d8b33d92f8c4b3c9934dd37bdc40f not found: ID does not exist" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.803382 4672 scope.go:117] "RemoveContainer" containerID="286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.803576 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d1e242-93af-4397-9030-232f8780c592-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.803966 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d1e242-93af-4397-9030-232f8780c592-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.807195 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d"} err="failed to get container status \"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d\": rpc error: code = NotFound desc = could not find container \"286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d\": container with ID starting with 286e2d6aa83633d2d60bc321f1ccdebbf5c0991660a1dc41fa59e543550cc55d not found: ID does not exist" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.808407 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-scripts" (OuterVolumeSpecName: "scripts") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.809190 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.809672 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d1e242-93af-4397-9030-232f8780c592-kube-api-access-j5s7n" (OuterVolumeSpecName: "kube-api-access-j5s7n") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "kube-api-access-j5s7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.863009 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.908444 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.908543 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.908556 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44d1e242-93af-4397-9030-232f8780c592-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.908578 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5s7n\" (UniqueName: \"kubernetes.io/projected/44d1e242-93af-4397-9030-232f8780c592-kube-api-access-j5s7n\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.911731 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:26 crc kubenswrapper[4672]: I1206 09:24:26.915687 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data" (OuterVolumeSpecName: "config-data") pod "44d1e242-93af-4397-9030-232f8780c592" (UID: "44d1e242-93af-4397-9030-232f8780c592"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.013724 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d1e242-93af-4397-9030-232f8780c592-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.087306 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.094627 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.110172 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:27 crc kubenswrapper[4672]: E1206 09:24:27.110500 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.110518 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api" Dec 06 09:24:27 crc kubenswrapper[4672]: E1206 09:24:27.110546 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api-log" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.110554 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api-log" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.110753 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api-log" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.110790 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d1e242-93af-4397-9030-232f8780c592" containerName="cinder-api" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.114800 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.119305 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.119394 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.123055 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.156499 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217221 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217271 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-config-data\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217407 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-scripts\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217484 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-config-data-custom\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217572 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217624 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2747dd-122d-4920-a266-6be569a3ab33-logs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217708 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1d2747dd-122d-4920-a266-6be569a3ab33-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.217737 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgbg\" (UniqueName: \"kubernetes.io/projected/1d2747dd-122d-4920-a266-6be569a3ab33-kube-api-access-lkgbg\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-config-data\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319495 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-scripts\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319542 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-config-data-custom\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319588 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319645 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319688 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2747dd-122d-4920-a266-6be569a3ab33-logs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319735 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d2747dd-122d-4920-a266-6be569a3ab33-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319773 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgbg\" (UniqueName: \"kubernetes.io/projected/1d2747dd-122d-4920-a266-6be569a3ab33-kube-api-access-lkgbg\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.319815 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.321283 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d2747dd-122d-4920-a266-6be569a3ab33-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.321752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2747dd-122d-4920-a266-6be569a3ab33-logs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.326649 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-scripts\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.326902 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.327154 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.327292 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.338451 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-config-data\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.339034 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d2747dd-122d-4920-a266-6be569a3ab33-config-data-custom\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.340012 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgbg\" (UniqueName: \"kubernetes.io/projected/1d2747dd-122d-4920-a266-6be569a3ab33-kube-api-access-lkgbg\") pod \"cinder-api-0\" (UID: \"1d2747dd-122d-4920-a266-6be569a3ab33\") " pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.432468 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.525743 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.801219 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerStarted","Data":"27fac396818df72cab7dd796207bb0cd99b118d530dfd6adc083d105564f79eb"} Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.801312 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerStarted","Data":"b8407de13682570be27a6bb0be250b0cbba8a25eaf0ea06fbc77a51d703a8d18"} Dec 06 09:24:27 crc kubenswrapper[4672]: I1206 09:24:27.966281 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 09:24:28 crc kubenswrapper[4672]: I1206 09:24:28.592104 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d1e242-93af-4397-9030-232f8780c592" path="/var/lib/kubelet/pods/44d1e242-93af-4397-9030-232f8780c592/volumes" Dec 06 09:24:28 crc kubenswrapper[4672]: I1206 09:24:28.828752 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d2747dd-122d-4920-a266-6be569a3ab33","Type":"ContainerStarted","Data":"298c8258924c78f84573bd8d9db6c4270fc35ed9912e2d3d1ffeff94576504ad"} Dec 06 09:24:28 crc kubenswrapper[4672]: I1206 09:24:28.828802 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d2747dd-122d-4920-a266-6be569a3ab33","Type":"ContainerStarted","Data":"549e52357a311b4d3fc08b02289a67d6fd02e00a75991c36e20909fd0f250820"} Dec 06 09:24:29 crc kubenswrapper[4672]: I1206 09:24:29.828069 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerStarted","Data":"bbe749131c3b53b37cdf15f3dffdc2092a18c46cc2d9a79592b299cc701e9f73"} Dec 06 09:24:29 crc kubenswrapper[4672]: I1206 09:24:29.829422 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 09:24:29 crc kubenswrapper[4672]: I1206 09:24:29.832888 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d2747dd-122d-4920-a266-6be569a3ab33","Type":"ContainerStarted","Data":"a1402cb9b0aeb87a8daabe8fd04e75a8f64e9d91345186db3bc09099514c2c06"} Dec 06 09:24:29 crc kubenswrapper[4672]: I1206 09:24:29.833571 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 09:24:29 crc kubenswrapper[4672]: I1206 09:24:29.860781 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.638043443 podStartE2EDuration="5.860765089s" podCreationTimestamp="2025-12-06 09:24:24 +0000 UTC" firstStartedPulling="2025-12-06 09:24:25.608259193 +0000 UTC m=+1083.352519480" lastFinishedPulling="2025-12-06 09:24:28.830980839 +0000 UTC m=+1086.575241126" observedRunningTime="2025-12-06 09:24:29.855201559 +0000 UTC m=+1087.599461856" watchObservedRunningTime="2025-12-06 09:24:29.860765089 +0000 UTC m=+1087.605025376" Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.628721 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:24:32 crc 
kubenswrapper[4672]: I1206 09:24:32.644972 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.64495942 podStartE2EDuration="5.64495942s" podCreationTimestamp="2025-12-06 09:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:29.886424693 +0000 UTC m=+1087.630685000" watchObservedRunningTime="2025-12-06 09:24:32.64495942 +0000 UTC m=+1090.389219707" Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.701061 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc79f6d89-lj8pk"] Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.701490 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerName="dnsmasq-dns" containerID="cri-o://20852569fb5d359adade53d3a3b28061d4b05c615ef4782f0e6616a32ef3b224" gracePeriod=10 Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.863989 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.893646 4672 generic.go:334] "Generic (PLEG): container finished" podID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerID="20852569fb5d359adade53d3a3b28061d4b05c615ef4782f0e6616a32ef3b224" exitCode=0 Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.893685 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" event={"ID":"6c4cde3b-2757-4097-a95e-e9765a4f5b37","Type":"ContainerDied","Data":"20852569fb5d359adade53d3a3b28061d4b05c615ef4782f0e6616a32ef3b224"} Dec 06 09:24:32 crc kubenswrapper[4672]: I1206 09:24:32.908469 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.293773 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.347749 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-nb\") pod \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.347966 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-config\") pod \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.348647 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-sb\") pod \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.348715 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-dns-svc\") pod \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.348809 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhfh2\" (UniqueName: \"kubernetes.io/projected/6c4cde3b-2757-4097-a95e-e9765a4f5b37-kube-api-access-mhfh2\") pod \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\" (UID: \"6c4cde3b-2757-4097-a95e-e9765a4f5b37\") " Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.386772 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4cde3b-2757-4097-a95e-e9765a4f5b37-kube-api-access-mhfh2" (OuterVolumeSpecName: "kube-api-access-mhfh2") pod "6c4cde3b-2757-4097-a95e-e9765a4f5b37" (UID: "6c4cde3b-2757-4097-a95e-e9765a4f5b37"). InnerVolumeSpecName "kube-api-access-mhfh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.422327 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-config" (OuterVolumeSpecName: "config") pod "6c4cde3b-2757-4097-a95e-e9765a4f5b37" (UID: "6c4cde3b-2757-4097-a95e-e9765a4f5b37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.428898 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c4cde3b-2757-4097-a95e-e9765a4f5b37" (UID: "6c4cde3b-2757-4097-a95e-e9765a4f5b37"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.449164 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c4cde3b-2757-4097-a95e-e9765a4f5b37" (UID: "6c4cde3b-2757-4097-a95e-e9765a4f5b37"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.452112 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.452141 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.452267 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhfh2\" (UniqueName: \"kubernetes.io/projected/6c4cde3b-2757-4097-a95e-e9765a4f5b37-kube-api-access-mhfh2\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.452286 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.454562 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c4cde3b-2757-4097-a95e-e9765a4f5b37" (UID: "6c4cde3b-2757-4097-a95e-e9765a4f5b37"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.554179 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4cde3b-2757-4097-a95e-e9765a4f5b37-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.902686 4672 generic.go:334] "Generic (PLEG): container finished" podID="b2eaecb7-d959-4452-ba91-029191056f70" containerID="fcfbb3a8fa34c8ded377c8feb108edb04a6c456309a3c94277e9b7012da91f57" exitCode=0 Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.902754 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f8bd445b-dxm5t" event={"ID":"b2eaecb7-d959-4452-ba91-029191056f70","Type":"ContainerDied","Data":"fcfbb3a8fa34c8ded377c8feb108edb04a6c456309a3c94277e9b7012da91f57"} Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.903036 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f8bd445b-dxm5t" event={"ID":"b2eaecb7-d959-4452-ba91-029191056f70","Type":"ContainerDied","Data":"badedc59d33a3c4f8cdc316723811c668feba004a801b5dda75bdf97e1e8070d"} Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.903053 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="badedc59d33a3c4f8cdc316723811c668feba004a801b5dda75bdf97e1e8070d" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.904628 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" event={"ID":"6c4cde3b-2757-4097-a95e-e9765a4f5b37","Type":"ContainerDied","Data":"95ec93f795c88c16940a4bdd6c86bcb038aa7f586f64b66b62f1297074d17123"} Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.904642 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc79f6d89-lj8pk" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.904688 4672 scope.go:117] "RemoveContainer" containerID="20852569fb5d359adade53d3a3b28061d4b05c615ef4782f0e6616a32ef3b224" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.904795 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="cinder-scheduler" containerID="cri-o://c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5" gracePeriod=30 Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.904868 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="probe" containerID="cri-o://78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef" gracePeriod=30 Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.956498 4672 scope.go:117] "RemoveContainer" containerID="c39fb0504c6bc2758d2c3e37cb76994065f797294f100b34cc565ef50bb97a8b" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.961464 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.982288 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc79f6d89-lj8pk"] Dec 06 09:24:33 crc kubenswrapper[4672]: I1206 09:24:33.989063 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cc79f6d89-lj8pk"] Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.063262 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-config\") pod \"b2eaecb7-d959-4452-ba91-029191056f70\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.063319 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-ovndb-tls-certs\") pod \"b2eaecb7-d959-4452-ba91-029191056f70\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.063354 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-combined-ca-bundle\") pod \"b2eaecb7-d959-4452-ba91-029191056f70\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.063415 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpsdf\" (UniqueName: \"kubernetes.io/projected/b2eaecb7-d959-4452-ba91-029191056f70-kube-api-access-rpsdf\") pod \"b2eaecb7-d959-4452-ba91-029191056f70\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.063507 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-httpd-config\") pod \"b2eaecb7-d959-4452-ba91-029191056f70\" (UID: \"b2eaecb7-d959-4452-ba91-029191056f70\") " Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.067772 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b2eaecb7-d959-4452-ba91-029191056f70" (UID: "b2eaecb7-d959-4452-ba91-029191056f70"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.081816 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2eaecb7-d959-4452-ba91-029191056f70-kube-api-access-rpsdf" (OuterVolumeSpecName: "kube-api-access-rpsdf") pod "b2eaecb7-d959-4452-ba91-029191056f70" (UID: "b2eaecb7-d959-4452-ba91-029191056f70"). InnerVolumeSpecName "kube-api-access-rpsdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.120799 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-config" (OuterVolumeSpecName: "config") pod "b2eaecb7-d959-4452-ba91-029191056f70" (UID: "b2eaecb7-d959-4452-ba91-029191056f70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.129876 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2eaecb7-d959-4452-ba91-029191056f70" (UID: "b2eaecb7-d959-4452-ba91-029191056f70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.165709 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpsdf\" (UniqueName: \"kubernetes.io/projected/b2eaecb7-d959-4452-ba91-029191056f70-kube-api-access-rpsdf\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.165746 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.165759 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.165795 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.170783 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b2eaecb7-d959-4452-ba91-029191056f70" (UID: "b2eaecb7-d959-4452-ba91-029191056f70"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.267315 4672 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2eaecb7-d959-4452-ba91-029191056f70-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.489122 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76bb4c894-tw7m5" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.568585 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" path="/var/lib/kubelet/pods/6c4cde3b-2757-4097-a95e-e9765a4f5b37/volumes" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.915378 4672 generic.go:334] "Generic (PLEG): container finished" podID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerID="78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef" exitCode=0 Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.915460 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28f3b4d4-d557-4187-b590-4601d0a3fb9b","Type":"ContainerDied","Data":"78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef"} Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.916966 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f8bd445b-dxm5t" Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.948033 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85f8bd445b-dxm5t"] Dec 06 09:24:34 crc kubenswrapper[4672]: I1206 09:24:34.971560 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-85f8bd445b-dxm5t"] Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.167447 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 09:24:36 crc kubenswrapper[4672]: E1206 09:24:36.168212 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerName="dnsmasq-dns" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.168228 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerName="dnsmasq-dns" Dec 06 09:24:36 crc kubenswrapper[4672]: E1206 09:24:36.168248 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-api" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.168255 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-api" Dec 06 09:24:36 crc kubenswrapper[4672]: E1206 09:24:36.168283 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerName="init" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.168291 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerName="init" Dec 06 09:24:36 crc kubenswrapper[4672]: E1206 09:24:36.168302 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-httpd" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.168310 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-httpd" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 
09:24:36.168525 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4cde3b-2757-4097-a95e-e9765a4f5b37" containerName="dnsmasq-dns" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.168542 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-httpd" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.168557 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2eaecb7-d959-4452-ba91-029191056f70" containerName="neutron-api" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.169477 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.177169 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qv2cm" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.177220 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.177321 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.180780 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.215343 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44fea3e1-80c8-4525-b613-467978a95351-combined-ca-bundle\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.215437 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/44fea3e1-80c8-4525-b613-467978a95351-openstack-config\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.215464 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/44fea3e1-80c8-4525-b613-467978a95351-openstack-config-secret\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.215506 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv754\" (UniqueName: \"kubernetes.io/projected/44fea3e1-80c8-4525-b613-467978a95351-kube-api-access-bv754\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.316677 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44fea3e1-80c8-4525-b613-467978a95351-combined-ca-bundle\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.316800 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/44fea3e1-80c8-4525-b613-467978a95351-openstack-config\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.316828 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/44fea3e1-80c8-4525-b613-467978a95351-openstack-config-secret\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.317921 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/44fea3e1-80c8-4525-b613-467978a95351-openstack-config\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.318045 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv754\" (UniqueName: \"kubernetes.io/projected/44fea3e1-80c8-4525-b613-467978a95351-kube-api-access-bv754\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.321145 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44fea3e1-80c8-4525-b613-467978a95351-combined-ca-bundle\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.324198 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/44fea3e1-80c8-4525-b613-467978a95351-openstack-config-secret\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.348494 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv754\" (UniqueName: \"kubernetes.io/projected/44fea3e1-80c8-4525-b613-467978a95351-kube-api-access-bv754\") pod \"openstackclient\" (UID: \"44fea3e1-80c8-4525-b613-467978a95351\") " pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.486827 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.575946 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2eaecb7-d959-4452-ba91-029191056f70" path="/var/lib/kubelet/pods/b2eaecb7-d959-4452-ba91-029191056f70/volumes" Dec 06 09:24:36 crc kubenswrapper[4672]: I1206 09:24:36.978819 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.594824 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740333 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data-custom\") pod \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740403 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-combined-ca-bundle\") pod \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740432 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f3b4d4-d557-4187-b590-4601d0a3fb9b-etc-machine-id\") pod \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740465 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-scripts\") pod \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740499 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6mxr\" (UniqueName: \"kubernetes.io/projected/28f3b4d4-d557-4187-b590-4601d0a3fb9b-kube-api-access-v6mxr\") pod \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740546 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data\") pod \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\" (UID: \"28f3b4d4-d557-4187-b590-4601d0a3fb9b\") " Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740749 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28f3b4d4-d557-4187-b590-4601d0a3fb9b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "28f3b4d4-d557-4187-b590-4601d0a3fb9b" (UID: "28f3b4d4-d557-4187-b590-4601d0a3fb9b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.740998 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f3b4d4-d557-4187-b590-4601d0a3fb9b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.749928 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-scripts" (OuterVolumeSpecName: "scripts") pod "28f3b4d4-d557-4187-b590-4601d0a3fb9b" (UID: "28f3b4d4-d557-4187-b590-4601d0a3fb9b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.751550 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f3b4d4-d557-4187-b590-4601d0a3fb9b-kube-api-access-v6mxr" (OuterVolumeSpecName: "kube-api-access-v6mxr") pod "28f3b4d4-d557-4187-b590-4601d0a3fb9b" (UID: "28f3b4d4-d557-4187-b590-4601d0a3fb9b"). InnerVolumeSpecName "kube-api-access-v6mxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.765408 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28f3b4d4-d557-4187-b590-4601d0a3fb9b" (UID: "28f3b4d4-d557-4187-b590-4601d0a3fb9b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.805736 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28f3b4d4-d557-4187-b590-4601d0a3fb9b" (UID: "28f3b4d4-d557-4187-b590-4601d0a3fb9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.842328 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.842360 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.842372 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.842380 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6mxr\" (UniqueName: \"kubernetes.io/projected/28f3b4d4-d557-4187-b590-4601d0a3fb9b-kube-api-access-v6mxr\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.852463 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data" (OuterVolumeSpecName: "config-data") pod "28f3b4d4-d557-4187-b590-4601d0a3fb9b" (UID: "28f3b4d4-d557-4187-b590-4601d0a3fb9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.940053 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"44fea3e1-80c8-4525-b613-467978a95351","Type":"ContainerStarted","Data":"c58f1146e76b019ea68bd1975ddfe7a5e28d5f30dcd87db0308dd9f44f40d61a"} Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.943116 4672 generic.go:334] "Generic (PLEG): container finished" podID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerID="c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5" exitCode=0 Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.943173 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28f3b4d4-d557-4187-b590-4601d0a3fb9b","Type":"ContainerDied","Data":"c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5"} Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.943231 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.943291 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28f3b4d4-d557-4187-b590-4601d0a3fb9b","Type":"ContainerDied","Data":"b31f624a4a2d8977de7d00fc005bbaaa49884d60390ff1d35e1446f7366c8041"} Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.943304 4672 scope.go:117] "RemoveContainer" containerID="78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.949174 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f3b4d4-d557-4187-b590-4601d0a3fb9b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.973919 4672 scope.go:117] "RemoveContainer" containerID="c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5" Dec 06 09:24:37 crc kubenswrapper[4672]: I1206 09:24:37.993092 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.002761 4672 scope.go:117] "RemoveContainer" containerID="78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.005033 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:38 crc kubenswrapper[4672]: E1206 09:24:38.006785 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef\": container with ID starting with 78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef not found: ID does not exist" containerID="78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.006820 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef"} err="failed to get container status \"78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef\": rpc error: code = NotFound desc = could not find container \"78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef\": container with ID starting with 78b4ad04e2040adf4a9fc51f6d29be83a9f61e31a953edc0ab68afb7394694ef not found: ID does not 
exist" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.006847 4672 scope.go:117] "RemoveContainer" containerID="c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5" Dec 06 09:24:38 crc kubenswrapper[4672]: E1206 09:24:38.008315 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5\": container with ID starting with c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5 not found: ID does not exist" containerID="c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.008354 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5"} err="failed to get container status \"c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5\": rpc error: code = NotFound desc = could not find container \"c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5\": container with ID starting with c3956c98bf2e5038917be18b9912b72873976c696a61c666e5cdb5bce5f6d8c5 not found: ID does not exist" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.013709 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:38 crc kubenswrapper[4672]: E1206 09:24:38.014089 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="cinder-scheduler" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.014106 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="cinder-scheduler" Dec 06 09:24:38 crc kubenswrapper[4672]: E1206 09:24:38.014126 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="probe" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.014133 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="probe" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.014300 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="cinder-scheduler" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.014322 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" containerName="probe" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.015160 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.018715 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.026187 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.050633 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-config-data\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.050708 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgf7b\" (UniqueName: \"kubernetes.io/projected/8dda2373-3b28-4086-b29c-3415f50f1d92-kube-api-access-sgf7b\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.050811 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-scripts\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.050829 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.051098 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.051222 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dda2373-3b28-4086-b29c-3415f50f1d92-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.152705 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-scripts\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.152751 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.152830 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.152870 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dda2373-3b28-4086-b29c-3415f50f1d92-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.152893 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-config-data\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.152909 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgf7b\" (UniqueName: \"kubernetes.io/projected/8dda2373-3b28-4086-b29c-3415f50f1d92-kube-api-access-sgf7b\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.153279 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dda2373-3b28-4086-b29c-3415f50f1d92-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.159182 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-scripts\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.159430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-config-data\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.164491 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.179781 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgf7b\" (UniqueName: \"kubernetes.io/projected/8dda2373-3b28-4086-b29c-3415f50f1d92-kube-api-access-sgf7b\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.182124 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dda2373-3b28-4086-b29c-3415f50f1d92-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8dda2373-3b28-4086-b29c-3415f50f1d92\") " 
pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.385572 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.616275 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f3b4d4-d557-4187-b590-4601d0a3fb9b" path="/var/lib/kubelet/pods/28f3b4d4-d557-4187-b590-4601d0a3fb9b/volumes" Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.701682 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 09:24:38 crc kubenswrapper[4672]: I1206 09:24:38.956146 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8dda2373-3b28-4086-b29c-3415f50f1d92","Type":"ContainerStarted","Data":"18688e92919bf365d7eaa3af52b506c3f855ba1d287a9ad37f0cf1eaccf189cf"} Dec 06 09:24:39 crc kubenswrapper[4672]: I1206 09:24:39.682090 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 09:24:39 crc kubenswrapper[4672]: I1206 09:24:39.969069 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8dda2373-3b28-4086-b29c-3415f50f1d92","Type":"ContainerStarted","Data":"8713945f374f36a53b7012acd04b942dd21cecd9e1317ea540c61159ec692186"} Dec 06 09:24:40 crc kubenswrapper[4672]: I1206 09:24:40.985753 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8dda2373-3b28-4086-b29c-3415f50f1d92","Type":"ContainerStarted","Data":"548e3b01e6cc089c3b9455c12c474bccdd76cd7854246686ba556d58ec23dd16"} Dec 06 09:24:41 crc kubenswrapper[4672]: I1206 09:24:41.012811 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.012785952 podStartE2EDuration="4.012785952s" podCreationTimestamp="2025-12-06 09:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:41.007577701 +0000 UTC m=+1098.751838028" watchObservedRunningTime="2025-12-06 09:24:41.012785952 +0000 UTC m=+1098.757046239" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.144053 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7xtlx"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.145033 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.187386 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7xtlx"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.244261 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrlt\" (UniqueName: \"kubernetes.io/projected/8acaf68e-bce8-458b-bdb7-054e1ea6269a-kube-api-access-xbrlt\") pod \"nova-api-db-create-7xtlx\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.244411 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8acaf68e-bce8-458b-bdb7-054e1ea6269a-operator-scripts\") pod \"nova-api-db-create-7xtlx\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.345426 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8acaf68e-bce8-458b-bdb7-054e1ea6269a-operator-scripts\") pod \"nova-api-db-create-7xtlx\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.345511 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrlt\" (UniqueName: \"kubernetes.io/projected/8acaf68e-bce8-458b-bdb7-054e1ea6269a-kube-api-access-xbrlt\") pod \"nova-api-db-create-7xtlx\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.346503 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8acaf68e-bce8-458b-bdb7-054e1ea6269a-operator-scripts\") pod \"nova-api-db-create-7xtlx\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.349334 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tgdtg"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.350378 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.370155 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgdtg"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.377008 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c7f7-account-create-update-2v6qt"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.385201 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.386138 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.389023 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrlt\" (UniqueName: \"kubernetes.io/projected/8acaf68e-bce8-458b-bdb7-054e1ea6269a-kube-api-access-xbrlt\") pod \"nova-api-db-create-7xtlx\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.389913 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.446682 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09b8611-6210-4e51-bf26-cfbcd8732572-operator-scripts\") pod \"nova-cell0-db-create-tgdtg\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.447082 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-operator-scripts\") pod \"nova-api-c7f7-account-create-update-2v6qt\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.447253 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5ln\" (UniqueName: \"kubernetes.io/projected/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-kube-api-access-ql5ln\") pod \"nova-api-c7f7-account-create-update-2v6qt\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.447283 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nxn\" (UniqueName: \"kubernetes.io/projected/e09b8611-6210-4e51-bf26-cfbcd8732572-kube-api-access-88nxn\") pod \"nova-cell0-db-create-tgdtg\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.458906 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7f7-account-create-update-2v6qt"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.479886 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.485059 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hwq9m"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.486250 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.494049 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hwq9m"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.543156 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6b72-account-create-update-cgrb2"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.544204 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.549024 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.550381 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5ln\" (UniqueName: \"kubernetes.io/projected/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-kube-api-access-ql5ln\") pod \"nova-api-c7f7-account-create-update-2v6qt\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.550408 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nxn\" (UniqueName: \"kubernetes.io/projected/e09b8611-6210-4e51-bf26-cfbcd8732572-kube-api-access-88nxn\") pod \"nova-cell0-db-create-tgdtg\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.550441 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzdt\" (UniqueName: \"kubernetes.io/projected/35287f3b-4228-4e03-9ee9-c837a57009d5-kube-api-access-ltzdt\") pod \"nova-cell1-db-create-hwq9m\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.550457 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35287f3b-4228-4e03-9ee9-c837a57009d5-operator-scripts\") pod \"nova-cell1-db-create-hwq9m\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.550485 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09b8611-6210-4e51-bf26-cfbcd8732572-operator-scripts\") pod \"nova-cell0-db-create-tgdtg\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.550534 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-operator-scripts\") pod \"nova-api-c7f7-account-create-update-2v6qt\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.551177 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-operator-scripts\") pod \"nova-api-c7f7-account-create-update-2v6qt\" (UID: 
\"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.551883 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09b8611-6210-4e51-bf26-cfbcd8732572-operator-scripts\") pod \"nova-cell0-db-create-tgdtg\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.567238 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6b72-account-create-update-cgrb2"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.578699 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nxn\" (UniqueName: \"kubernetes.io/projected/e09b8611-6210-4e51-bf26-cfbcd8732572-kube-api-access-88nxn\") pod \"nova-cell0-db-create-tgdtg\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.580074 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5ln\" (UniqueName: \"kubernetes.io/projected/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-kube-api-access-ql5ln\") pod \"nova-api-c7f7-account-create-update-2v6qt\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.652014 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbvhs\" (UniqueName: \"kubernetes.io/projected/3bc460c7-9f51-4f71-bc98-d4b588694439-kube-api-access-gbvhs\") pod \"nova-cell0-6b72-account-create-update-cgrb2\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.652162 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bc460c7-9f51-4f71-bc98-d4b588694439-operator-scripts\") pod \"nova-cell0-6b72-account-create-update-cgrb2\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.652205 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzdt\" (UniqueName: \"kubernetes.io/projected/35287f3b-4228-4e03-9ee9-c837a57009d5-kube-api-access-ltzdt\") pod \"nova-cell1-db-create-hwq9m\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.652223 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35287f3b-4228-4e03-9ee9-c837a57009d5-operator-scripts\") pod \"nova-cell1-db-create-hwq9m\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.654517 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35287f3b-4228-4e03-9ee9-c837a57009d5-operator-scripts\") pod \"nova-cell1-db-create-hwq9m\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 
crc kubenswrapper[4672]: I1206 09:24:43.669483 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzdt\" (UniqueName: \"kubernetes.io/projected/35287f3b-4228-4e03-9ee9-c837a57009d5-kube-api-access-ltzdt\") pod \"nova-cell1-db-create-hwq9m\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.690377 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.740215 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-01e6-account-create-update-t97qt"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.741212 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.745244 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.755393 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7mj\" (UniqueName: \"kubernetes.io/projected/5dfad81d-78bd-4063-87f6-a26a8d24205f-kube-api-access-hz7mj\") pod \"nova-cell1-01e6-account-create-update-t97qt\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.755662 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dfad81d-78bd-4063-87f6-a26a8d24205f-operator-scripts\") pod \"nova-cell1-01e6-account-create-update-t97qt\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.755815 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bc460c7-9f51-4f71-bc98-d4b588694439-operator-scripts\") pod \"nova-cell0-6b72-account-create-update-cgrb2\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.755947 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbvhs\" (UniqueName: \"kubernetes.io/projected/3bc460c7-9f51-4f71-bc98-d4b588694439-kube-api-access-gbvhs\") pod \"nova-cell0-6b72-account-create-update-cgrb2\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.756424 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bc460c7-9f51-4f71-bc98-d4b588694439-operator-scripts\") pod \"nova-cell0-6b72-account-create-update-cgrb2\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.757295 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-01e6-account-create-update-t97qt"] Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.774211 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.781295 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbvhs\" (UniqueName: \"kubernetes.io/projected/3bc460c7-9f51-4f71-bc98-d4b588694439-kube-api-access-gbvhs\") pod \"nova-cell0-6b72-account-create-update-cgrb2\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.836929 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.857787 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dfad81d-78bd-4063-87f6-a26a8d24205f-operator-scripts\") pod \"nova-cell1-01e6-account-create-update-t97qt\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.857972 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7mj\" (UniqueName: \"kubernetes.io/projected/5dfad81d-78bd-4063-87f6-a26a8d24205f-kube-api-access-hz7mj\") pod \"nova-cell1-01e6-account-create-update-t97qt\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.858739 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dfad81d-78bd-4063-87f6-a26a8d24205f-operator-scripts\") pod \"nova-cell1-01e6-account-create-update-t97qt\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.868253 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:43 crc kubenswrapper[4672]: I1206 09:24:43.873264 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7mj\" (UniqueName: \"kubernetes.io/projected/5dfad81d-78bd-4063-87f6-a26a8d24205f-kube-api-access-hz7mj\") pod \"nova-cell1-01e6-account-create-update-t97qt\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:44 crc kubenswrapper[4672]: I1206 09:24:44.066220 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:48 crc kubenswrapper[4672]: I1206 09:24:48.644205 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 09:24:49 crc kubenswrapper[4672]: I1206 09:24:49.008156 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:49 crc kubenswrapper[4672]: I1206 09:24:49.008402 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-central-agent" containerID="cri-o://d2ac910eed5c8566d194e428bb1bf075dffa95db1329ba68cacb7ab41fef24fc" gracePeriod=30 Dec 06 09:24:49 crc kubenswrapper[4672]: I1206 09:24:49.009075 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="proxy-httpd" containerID="cri-o://bbe749131c3b53b37cdf15f3dffdc2092a18c46cc2d9a79592b299cc701e9f73" gracePeriod=30 Dec 06 09:24:49 crc kubenswrapper[4672]: I1206 09:24:49.009126 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="sg-core" containerID="cri-o://27fac396818df72cab7dd796207bb0cd99b118d530dfd6adc083d105564f79eb" gracePeriod=30 Dec 06 09:24:49 crc kubenswrapper[4672]: I1206 09:24:49.009159 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-notification-agent" containerID="cri-o://b8407de13682570be27a6bb0be250b0cbba8a25eaf0ea06fbc77a51d703a8d18" gracePeriod=30 Dec 06 09:24:49 crc kubenswrapper[4672]: I1206 09:24:49.029131 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.152:3000/\": EOF" Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.082782 4672 generic.go:334] "Generic (PLEG): container finished" podID="4908de30-b638-44a7-b414-d1ac88946fb1" containerID="bbe749131c3b53b37cdf15f3dffdc2092a18c46cc2d9a79592b299cc701e9f73" exitCode=0 Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.083385 4672 generic.go:334] "Generic (PLEG): container finished" podID="4908de30-b638-44a7-b414-d1ac88946fb1" containerID="27fac396818df72cab7dd796207bb0cd99b118d530dfd6adc083d105564f79eb" exitCode=2 Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.083401 4672 generic.go:334] "Generic (PLEG): container finished" podID="4908de30-b638-44a7-b414-d1ac88946fb1" containerID="d2ac910eed5c8566d194e428bb1bf075dffa95db1329ba68cacb7ab41fef24fc" exitCode=0 Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.082901 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerDied","Data":"bbe749131c3b53b37cdf15f3dffdc2092a18c46cc2d9a79592b299cc701e9f73"} Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.083446 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerDied","Data":"27fac396818df72cab7dd796207bb0cd99b118d530dfd6adc083d105564f79eb"} Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.083466 4672 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerDied","Data":"d2ac910eed5c8566d194e428bb1bf075dffa95db1329ba68cacb7ab41fef24fc"} Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.527222 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7xtlx"] Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.537879 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hwq9m"] Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.621212 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-01e6-account-create-update-t97qt"] Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.707655 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7f7-account-create-update-2v6qt"] Dec 06 09:24:50 crc kubenswrapper[4672]: W1206 09:24:50.723588 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aa4486a_c2a9_4144_88f0_68ceb2b8ce76.slice/crio-32c2e2ac1c55ee58f9ad02a1162db5c6e84ba179b6c530e93eee8d0343a00ba0 WatchSource:0}: Error finding container 32c2e2ac1c55ee58f9ad02a1162db5c6e84ba179b6c530e93eee8d0343a00ba0: Status 404 returned error can't find the container with id 32c2e2ac1c55ee58f9ad02a1162db5c6e84ba179b6c530e93eee8d0343a00ba0 Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.723817 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6b72-account-create-update-cgrb2"] Dec 06 09:24:50 crc kubenswrapper[4672]: I1206 09:24:50.731288 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgdtg"] Dec 06 09:24:50 crc kubenswrapper[4672]: W1206 09:24:50.760861 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc460c7_9f51_4f71_bc98_d4b588694439.slice/crio-e73f5c8498a3bd13f06eb46e01e9db0bd25cd4245179cbe6afa58bc59cb541cf WatchSource:0}: Error finding container e73f5c8498a3bd13f06eb46e01e9db0bd25cd4245179cbe6afa58bc59cb541cf: Status 404 returned error can't find the container with id e73f5c8498a3bd13f06eb46e01e9db0bd25cd4245179cbe6afa58bc59cb541cf Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.108365 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" event={"ID":"5dfad81d-78bd-4063-87f6-a26a8d24205f","Type":"ContainerStarted","Data":"a1440b8c3c31a443ff15bbaf5ce2f6a35beee29b7a9789e7f12ae16f5806f0a6"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.135392 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" event={"ID":"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76","Type":"ContainerStarted","Data":"32c2e2ac1c55ee58f9ad02a1162db5c6e84ba179b6c530e93eee8d0343a00ba0"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.136863 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" event={"ID":"3bc460c7-9f51-4f71-bc98-d4b588694439","Type":"ContainerStarted","Data":"e73f5c8498a3bd13f06eb46e01e9db0bd25cd4245179cbe6afa58bc59cb541cf"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.138046 4672 generic.go:334] "Generic (PLEG): container finished" podID="35287f3b-4228-4e03-9ee9-c837a57009d5" containerID="1a3a91f37ad129034295446401788ca2ef731e6bf91f7c6dc9c5db44128b2214" exitCode=0 Dec 06 
09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.138091 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hwq9m" event={"ID":"35287f3b-4228-4e03-9ee9-c837a57009d5","Type":"ContainerDied","Data":"1a3a91f37ad129034295446401788ca2ef731e6bf91f7c6dc9c5db44128b2214"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.138106 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hwq9m" event={"ID":"35287f3b-4228-4e03-9ee9-c837a57009d5","Type":"ContainerStarted","Data":"bf7407df677128d382de9c73fe5f2e62b3002099eb2dc678df4c02274fe861d3"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.140053 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"44fea3e1-80c8-4525-b613-467978a95351","Type":"ContainerStarted","Data":"61ed771a30318eeda49ccd155b2c7beeb34ee84dbc635f453a4e97ccaeb31b28"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.147279 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xtlx" event={"ID":"8acaf68e-bce8-458b-bdb7-054e1ea6269a","Type":"ContainerStarted","Data":"5d88cb3ecea1b968344bbad4d530e5577cca48c6a8ff7b18cb6dde570d86a4ad"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.147327 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xtlx" event={"ID":"8acaf68e-bce8-458b-bdb7-054e1ea6269a","Type":"ContainerStarted","Data":"403c590ed2a652fe99cdf05774ec177e8db61699993a2893086dcc93e31cfcbd"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.148873 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgdtg" event={"ID":"e09b8611-6210-4e51-bf26-cfbcd8732572","Type":"ContainerStarted","Data":"a0e2a3aac0b19fb9fa840ef5e91d3c9f878a8959b7b4154bcd8d776f14468a2c"} Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.188330 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-7xtlx" podStartSLOduration=8.188312986 podStartE2EDuration="8.188312986s" podCreationTimestamp="2025-12-06 09:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:24:51.187875465 +0000 UTC m=+1108.932135752" watchObservedRunningTime="2025-12-06 09:24:51.188312986 +0000 UTC m=+1108.932573273" Dec 06 09:24:51 crc kubenswrapper[4672]: I1206 09:24:51.214902 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.282990941 podStartE2EDuration="15.214885615s" podCreationTimestamp="2025-12-06 09:24:36 +0000 UTC" firstStartedPulling="2025-12-06 09:24:36.987979571 +0000 UTC m=+1094.732239858" lastFinishedPulling="2025-12-06 09:24:49.919874245 +0000 UTC m=+1107.664134532" observedRunningTime="2025-12-06 09:24:51.20731866 +0000 UTC m=+1108.951578947" watchObservedRunningTime="2025-12-06 09:24:51.214885615 +0000 UTC m=+1108.959145902" Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.159134 4672 generic.go:334] "Generic (PLEG): container finished" podID="0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" containerID="fde2a159531758f663d3b1495a399a2e254e9b46e28bb6e7a904cac7ea8b5d59" exitCode=0 Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.159197 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" 
event={"ID":"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76","Type":"ContainerDied","Data":"fde2a159531758f663d3b1495a399a2e254e9b46e28bb6e7a904cac7ea8b5d59"} Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.161932 4672 generic.go:334] "Generic (PLEG): container finished" podID="3bc460c7-9f51-4f71-bc98-d4b588694439" containerID="96e31fe026ffc3c7cfa3864a6332b7a364aeb3f75dacce66d8f2c0d2c3a709a6" exitCode=0 Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.161989 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" event={"ID":"3bc460c7-9f51-4f71-bc98-d4b588694439","Type":"ContainerDied","Data":"96e31fe026ffc3c7cfa3864a6332b7a364aeb3f75dacce66d8f2c0d2c3a709a6"} Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.163763 4672 generic.go:334] "Generic (PLEG): container finished" podID="8acaf68e-bce8-458b-bdb7-054e1ea6269a" containerID="5d88cb3ecea1b968344bbad4d530e5577cca48c6a8ff7b18cb6dde570d86a4ad" exitCode=0 Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.163834 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xtlx" event={"ID":"8acaf68e-bce8-458b-bdb7-054e1ea6269a","Type":"ContainerDied","Data":"5d88cb3ecea1b968344bbad4d530e5577cca48c6a8ff7b18cb6dde570d86a4ad"} Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.167457 4672 generic.go:334] "Generic (PLEG): container finished" podID="e09b8611-6210-4e51-bf26-cfbcd8732572" containerID="72094cf055bd3e19b3c7495992f99628ed015cfb7785ef10e2c771bb078e2a54" exitCode=0 Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.167594 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgdtg" event={"ID":"e09b8611-6210-4e51-bf26-cfbcd8732572","Type":"ContainerDied","Data":"72094cf055bd3e19b3c7495992f99628ed015cfb7785ef10e2c771bb078e2a54"} Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.170071 4672 generic.go:334] "Generic (PLEG): container finished" podID="5dfad81d-78bd-4063-87f6-a26a8d24205f" containerID="28b71027accb54967268c1a6d7d788e99b44d6a325bae206815caa9315911420" exitCode=0 Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.170330 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" event={"ID":"5dfad81d-78bd-4063-87f6-a26a8d24205f","Type":"ContainerDied","Data":"28b71027accb54967268c1a6d7d788e99b44d6a325bae206815caa9315911420"} Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.546465 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.730356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35287f3b-4228-4e03-9ee9-c837a57009d5-operator-scripts\") pod \"35287f3b-4228-4e03-9ee9-c837a57009d5\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.730543 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltzdt\" (UniqueName: \"kubernetes.io/projected/35287f3b-4228-4e03-9ee9-c837a57009d5-kube-api-access-ltzdt\") pod \"35287f3b-4228-4e03-9ee9-c837a57009d5\" (UID: \"35287f3b-4228-4e03-9ee9-c837a57009d5\") " Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.730970 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35287f3b-4228-4e03-9ee9-c837a57009d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35287f3b-4228-4e03-9ee9-c837a57009d5" (UID: "35287f3b-4228-4e03-9ee9-c837a57009d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.731141 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35287f3b-4228-4e03-9ee9-c837a57009d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.743909 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35287f3b-4228-4e03-9ee9-c837a57009d5-kube-api-access-ltzdt" (OuterVolumeSpecName: "kube-api-access-ltzdt") pod "35287f3b-4228-4e03-9ee9-c837a57009d5" (UID: "35287f3b-4228-4e03-9ee9-c837a57009d5"). InnerVolumeSpecName "kube-api-access-ltzdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:52 crc kubenswrapper[4672]: I1206 09:24:52.832801 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltzdt\" (UniqueName: \"kubernetes.io/projected/35287f3b-4228-4e03-9ee9-c837a57009d5-kube-api-access-ltzdt\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.178121 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hwq9m" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.181690 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hwq9m" event={"ID":"35287f3b-4228-4e03-9ee9-c837a57009d5","Type":"ContainerDied","Data":"bf7407df677128d382de9c73fe5f2e62b3002099eb2dc678df4c02274fe861d3"} Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.181748 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf7407df677128d382de9c73fe5f2e62b3002099eb2dc678df4c02274fe861d3" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.701196 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.854423 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrlt\" (UniqueName: \"kubernetes.io/projected/8acaf68e-bce8-458b-bdb7-054e1ea6269a-kube-api-access-xbrlt\") pod \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.854476 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8acaf68e-bce8-458b-bdb7-054e1ea6269a-operator-scripts\") pod \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\" (UID: \"8acaf68e-bce8-458b-bdb7-054e1ea6269a\") " Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.855426 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8acaf68e-bce8-458b-bdb7-054e1ea6269a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8acaf68e-bce8-458b-bdb7-054e1ea6269a" (UID: "8acaf68e-bce8-458b-bdb7-054e1ea6269a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.887732 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8acaf68e-bce8-458b-bdb7-054e1ea6269a-kube-api-access-xbrlt" (OuterVolumeSpecName: "kube-api-access-xbrlt") pod "8acaf68e-bce8-458b-bdb7-054e1ea6269a" (UID: "8acaf68e-bce8-458b-bdb7-054e1ea6269a"). InnerVolumeSpecName "kube-api-access-xbrlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.956698 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrlt\" (UniqueName: \"kubernetes.io/projected/8acaf68e-bce8-458b-bdb7-054e1ea6269a-kube-api-access-xbrlt\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:53 crc kubenswrapper[4672]: I1206 09:24:53.956727 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8acaf68e-bce8-458b-bdb7-054e1ea6269a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.024578 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.030210 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.035985 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.041771 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.159837 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7mj\" (UniqueName: \"kubernetes.io/projected/5dfad81d-78bd-4063-87f6-a26a8d24205f-kube-api-access-hz7mj\") pod \"5dfad81d-78bd-4063-87f6-a26a8d24205f\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160357 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dfad81d-78bd-4063-87f6-a26a8d24205f-operator-scripts\") pod \"5dfad81d-78bd-4063-87f6-a26a8d24205f\" (UID: \"5dfad81d-78bd-4063-87f6-a26a8d24205f\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160634 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09b8611-6210-4e51-bf26-cfbcd8732572-operator-scripts\") pod \"e09b8611-6210-4e51-bf26-cfbcd8732572\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160692 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbvhs\" (UniqueName: \"kubernetes.io/projected/3bc460c7-9f51-4f71-bc98-d4b588694439-kube-api-access-gbvhs\") pod \"3bc460c7-9f51-4f71-bc98-d4b588694439\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160814 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-operator-scripts\") pod \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160895 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5ln\" (UniqueName: \"kubernetes.io/projected/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-kube-api-access-ql5ln\") pod \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\" (UID: \"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160930 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bc460c7-9f51-4f71-bc98-d4b588694439-operator-scripts\") pod \"3bc460c7-9f51-4f71-bc98-d4b588694439\" (UID: \"3bc460c7-9f51-4f71-bc98-d4b588694439\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.160972 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88nxn\" (UniqueName: \"kubernetes.io/projected/e09b8611-6210-4e51-bf26-cfbcd8732572-kube-api-access-88nxn\") pod \"e09b8611-6210-4e51-bf26-cfbcd8732572\" (UID: \"e09b8611-6210-4e51-bf26-cfbcd8732572\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.164618 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e09b8611-6210-4e51-bf26-cfbcd8732572-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e09b8611-6210-4e51-bf26-cfbcd8732572" (UID: "e09b8611-6210-4e51-bf26-cfbcd8732572"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.165102 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dfad81d-78bd-4063-87f6-a26a8d24205f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5dfad81d-78bd-4063-87f6-a26a8d24205f" (UID: "5dfad81d-78bd-4063-87f6-a26a8d24205f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.167793 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc460c7-9f51-4f71-bc98-d4b588694439-kube-api-access-gbvhs" (OuterVolumeSpecName: "kube-api-access-gbvhs") pod "3bc460c7-9f51-4f71-bc98-d4b588694439" (UID: "3bc460c7-9f51-4f71-bc98-d4b588694439"). InnerVolumeSpecName "kube-api-access-gbvhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.171748 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc460c7-9f51-4f71-bc98-d4b588694439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bc460c7-9f51-4f71-bc98-d4b588694439" (UID: "3bc460c7-9f51-4f71-bc98-d4b588694439"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.172957 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-kube-api-access-ql5ln" (OuterVolumeSpecName: "kube-api-access-ql5ln") pod "0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" (UID: "0aa4486a-c2a9-4144-88f0-68ceb2b8ce76"). InnerVolumeSpecName "kube-api-access-ql5ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.179942 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dfad81d-78bd-4063-87f6-a26a8d24205f-kube-api-access-hz7mj" (OuterVolumeSpecName: "kube-api-access-hz7mj") pod "5dfad81d-78bd-4063-87f6-a26a8d24205f" (UID: "5dfad81d-78bd-4063-87f6-a26a8d24205f"). InnerVolumeSpecName "kube-api-access-hz7mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.184055 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" (UID: "0aa4486a-c2a9-4144-88f0-68ceb2b8ce76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.184244 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09b8611-6210-4e51-bf26-cfbcd8732572-kube-api-access-88nxn" (OuterVolumeSpecName: "kube-api-access-88nxn") pod "e09b8611-6210-4e51-bf26-cfbcd8732572" (UID: "e09b8611-6210-4e51-bf26-cfbcd8732572"). InnerVolumeSpecName "kube-api-access-88nxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.202738 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" event={"ID":"3bc460c7-9f51-4f71-bc98-d4b588694439","Type":"ContainerDied","Data":"e73f5c8498a3bd13f06eb46e01e9db0bd25cd4245179cbe6afa58bc59cb541cf"} Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.202783 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73f5c8498a3bd13f06eb46e01e9db0bd25cd4245179cbe6afa58bc59cb541cf" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.202839 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6b72-account-create-update-cgrb2" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.211578 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xtlx" event={"ID":"8acaf68e-bce8-458b-bdb7-054e1ea6269a","Type":"ContainerDied","Data":"403c590ed2a652fe99cdf05774ec177e8db61699993a2893086dcc93e31cfcbd"} Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.211629 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403c590ed2a652fe99cdf05774ec177e8db61699993a2893086dcc93e31cfcbd" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.211673 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7xtlx" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.214917 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgdtg" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.215065 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgdtg" event={"ID":"e09b8611-6210-4e51-bf26-cfbcd8732572","Type":"ContainerDied","Data":"a0e2a3aac0b19fb9fa840ef5e91d3c9f878a8959b7b4154bcd8d776f14468a2c"} Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.215103 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0e2a3aac0b19fb9fa840ef5e91d3c9f878a8959b7b4154bcd8d776f14468a2c" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.226178 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" event={"ID":"5dfad81d-78bd-4063-87f6-a26a8d24205f","Type":"ContainerDied","Data":"a1440b8c3c31a443ff15bbaf5ce2f6a35beee29b7a9789e7f12ae16f5806f0a6"} Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.226222 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1440b8c3c31a443ff15bbaf5ce2f6a35beee29b7a9789e7f12ae16f5806f0a6" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.226224 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-01e6-account-create-update-t97qt" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.232527 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" event={"ID":"0aa4486a-c2a9-4144-88f0-68ceb2b8ce76","Type":"ContainerDied","Data":"32c2e2ac1c55ee58f9ad02a1162db5c6e84ba179b6c530e93eee8d0343a00ba0"} Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.232576 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c2e2ac1c55ee58f9ad02a1162db5c6e84ba179b6c530e93eee8d0343a00ba0" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.232611 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7f7-account-create-update-2v6qt" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.237422 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.238939 4672 generic.go:334] "Generic (PLEG): container finished" podID="4908de30-b638-44a7-b414-d1ac88946fb1" containerID="b8407de13682570be27a6bb0be250b0cbba8a25eaf0ea06fbc77a51d703a8d18" exitCode=0 Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.238972 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerDied","Data":"b8407de13682570be27a6bb0be250b0cbba8a25eaf0ea06fbc77a51d703a8d18"} Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.238999 4672 scope.go:117] "RemoveContainer" containerID="bbe749131c3b53b37cdf15f3dffdc2092a18c46cc2d9a79592b299cc701e9f73" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.264888 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5ln\" (UniqueName: \"kubernetes.io/projected/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-kube-api-access-ql5ln\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.264972 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bc460c7-9f51-4f71-bc98-d4b588694439-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.264986 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88nxn\" (UniqueName: \"kubernetes.io/projected/e09b8611-6210-4e51-bf26-cfbcd8732572-kube-api-access-88nxn\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.265032 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz7mj\" (UniqueName: \"kubernetes.io/projected/5dfad81d-78bd-4063-87f6-a26a8d24205f-kube-api-access-hz7mj\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.265048 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dfad81d-78bd-4063-87f6-a26a8d24205f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.265062 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e09b8611-6210-4e51-bf26-cfbcd8732572-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.265075 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbvhs\" 
(UniqueName: \"kubernetes.io/projected/3bc460c7-9f51-4f71-bc98-d4b588694439-kube-api-access-gbvhs\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.265126 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.309438 4672 scope.go:117] "RemoveContainer" containerID="27fac396818df72cab7dd796207bb0cd99b118d530dfd6adc083d105564f79eb" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.373040 4672 scope.go:117] "RemoveContainer" containerID="b8407de13682570be27a6bb0be250b0cbba8a25eaf0ea06fbc77a51d703a8d18" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374330 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-log-httpd\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374364 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-combined-ca-bundle\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374391 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-config-data\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374425 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-sg-core-conf-yaml\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374481 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z79fh\" (UniqueName: \"kubernetes.io/projected/4908de30-b638-44a7-b414-d1ac88946fb1-kube-api-access-z79fh\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374566 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-scripts\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.374626 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-run-httpd\") pod \"4908de30-b638-44a7-b414-d1ac88946fb1\" (UID: \"4908de30-b638-44a7-b414-d1ac88946fb1\") " Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.402858 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: 
"4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.403390 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: "4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.425159 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4908de30-b638-44a7-b414-d1ac88946fb1-kube-api-access-z79fh" (OuterVolumeSpecName: "kube-api-access-z79fh") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: "4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "kube-api-access-z79fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.427640 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-scripts" (OuterVolumeSpecName: "scripts") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: "4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.430894 4672 scope.go:117] "RemoveContainer" containerID="d2ac910eed5c8566d194e428bb1bf075dffa95db1329ba68cacb7ab41fef24fc" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.434770 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: "4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.476984 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.477024 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.477038 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z79fh\" (UniqueName: \"kubernetes.io/projected/4908de30-b638-44a7-b414-d1ac88946fb1-kube-api-access-z79fh\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.477049 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.477061 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4908de30-b638-44a7-b414-d1ac88946fb1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.486229 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: "4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.509784 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-config-data" (OuterVolumeSpecName: "config-data") pod "4908de30-b638-44a7-b414-d1ac88946fb1" (UID: "4908de30-b638-44a7-b414-d1ac88946fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.578690 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:54 crc kubenswrapper[4672]: I1206 09:24:54.578722 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4908de30-b638-44a7-b414-d1ac88946fb1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.246992 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4908de30-b638-44a7-b414-d1ac88946fb1","Type":"ContainerDied","Data":"8ffc6c533b3e606bbeec645322302b554124d47450969ae47366a7f60d2c6408"} Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.247058 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.272175 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.281446 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300364 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300718 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8acaf68e-bce8-458b-bdb7-054e1ea6269a" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300736 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8acaf68e-bce8-458b-bdb7-054e1ea6269a" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300749 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35287f3b-4228-4e03-9ee9-c837a57009d5" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300755 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="35287f3b-4228-4e03-9ee9-c837a57009d5" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300770 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300777 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300788 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09b8611-6210-4e51-bf26-cfbcd8732572" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300793 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09b8611-6210-4e51-bf26-cfbcd8732572" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300808 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc460c7-9f51-4f71-bc98-d4b588694439" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300813 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc460c7-9f51-4f71-bc98-d4b588694439" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300821 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="proxy-httpd" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300827 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="proxy-httpd" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300840 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-central-agent" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300845 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-central-agent" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300860 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="sg-core" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300867 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="sg-core" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300879 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-notification-agent" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300885 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-notification-agent" Dec 06 09:24:55 crc kubenswrapper[4672]: E1206 09:24:55.300897 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfad81d-78bd-4063-87f6-a26a8d24205f" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.300902 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfad81d-78bd-4063-87f6-a26a8d24205f" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301064 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-central-agent" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301076 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc460c7-9f51-4f71-bc98-d4b588694439" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301088 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="sg-core" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301096 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="35287f3b-4228-4e03-9ee9-c837a57009d5" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301105 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09b8611-6210-4e51-bf26-cfbcd8732572" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301116 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301130 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8acaf68e-bce8-458b-bdb7-054e1ea6269a" containerName="mariadb-database-create" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301137 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dfad81d-78bd-4063-87f6-a26a8d24205f" containerName="mariadb-account-create-update" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301148 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="proxy-httpd" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.301158 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" containerName="ceilometer-notification-agent" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.302626 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.304773 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.304901 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.320948 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391647 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-run-httpd\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391705 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-log-httpd\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391785 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-scripts\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391831 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4ls\" (UniqueName: \"kubernetes.io/projected/fd631857-ab49-4ebb-be4e-5ea40cf50670-kube-api-access-7h4ls\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.391850 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-config-data\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.493746 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 
09:24:55.493805 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.493840 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-scripts\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.493860 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4ls\" (UniqueName: \"kubernetes.io/projected/fd631857-ab49-4ebb-be4e-5ea40cf50670-kube-api-access-7h4ls\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.493883 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-config-data\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.493937 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-run-httpd\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.493963 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-log-httpd\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.494422 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-log-httpd\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.494846 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-run-httpd\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.499580 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.499736 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-config-data\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0" Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.500291 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-scripts\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0"
Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.517437 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0"
Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.525374 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4ls\" (UniqueName: \"kubernetes.io/projected/fd631857-ab49-4ebb-be4e-5ea40cf50670-kube-api-access-7h4ls\") pod \"ceilometer-0\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " pod="openstack/ceilometer-0"
Dec 06 09:24:55 crc kubenswrapper[4672]: I1206 09:24:55.616903 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 09:24:56 crc kubenswrapper[4672]: I1206 09:24:56.099317 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 09:24:56 crc kubenswrapper[4672]: W1206 09:24:56.105159 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd631857_ab49_4ebb_be4e_5ea40cf50670.slice/crio-f6ab52ba3bfa6190ca973cd637f1850dcaf74d30889044098ad61865b387073d WatchSource:0}: Error finding container f6ab52ba3bfa6190ca973cd637f1850dcaf74d30889044098ad61865b387073d: Status 404 returned error can't find the container with id f6ab52ba3bfa6190ca973cd637f1850dcaf74d30889044098ad61865b387073d
Dec 06 09:24:56 crc kubenswrapper[4672]: I1206 09:24:56.254770 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerStarted","Data":"f6ab52ba3bfa6190ca973cd637f1850dcaf74d30889044098ad61865b387073d"}
Dec 06 09:24:56 crc kubenswrapper[4672]: I1206 09:24:56.570767 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4908de30-b638-44a7-b414-d1ac88946fb1" path="/var/lib/kubelet/pods/4908de30-b638-44a7-b414-d1ac88946fb1/volumes"
Dec 06 09:24:56 crc kubenswrapper[4672]: I1206 09:24:56.961670 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 09:24:57 crc kubenswrapper[4672]: I1206 09:24:57.271568 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerStarted","Data":"9ce34cda995fe452860e9abc7ab7b3cfb26358bb7f178b5d97416d44624c3008"}
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.297575 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerStarted","Data":"921ae9e679b6cd28353ace79deed51f63e8f6a2b6c588520f7eba34a1d29119e"}
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.297862 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerStarted","Data":"df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d"}
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.903717 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd7p6"]
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.905276 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.908009 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.908706 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.909390 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-txpqx"
Dec 06 09:24:58 crc kubenswrapper[4672]: I1206 09:24:58.921442 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd7p6"]
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.052301 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-config-data\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.052592 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-scripts\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.052815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.052991 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk452\" (UniqueName: \"kubernetes.io/projected/2afdbd55-d7bc-4744-878a-389d84f66824-kube-api-access-jk452\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.155018 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-config-data\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.155424 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-scripts\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.156536 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.156752 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk452\" (UniqueName: \"kubernetes.io/projected/2afdbd55-d7bc-4744-878a-389d84f66824-kube-api-access-jk452\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.160259 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-config-data\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.160846 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-scripts\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.169073 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.172162 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk452\" (UniqueName: \"kubernetes.io/projected/2afdbd55-d7bc-4744-878a-389d84f66824-kube-api-access-jk452\") pod \"nova-cell0-conductor-db-sync-kd7p6\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") " pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.220221 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:24:59 crc kubenswrapper[4672]: I1206 09:24:59.672331 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd7p6"]
Dec 06 09:24:59 crc kubenswrapper[4672]: W1206 09:24:59.672860 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2afdbd55_d7bc_4744_878a_389d84f66824.slice/crio-974942f29d4d62efe8cdcf36b97d43bd0ca87f5c2097d7aafc1e1fb97b4b431c WatchSource:0}: Error finding container 974942f29d4d62efe8cdcf36b97d43bd0ca87f5c2097d7aafc1e1fb97b4b431c: Status 404 returned error can't find the container with id 974942f29d4d62efe8cdcf36b97d43bd0ca87f5c2097d7aafc1e1fb97b4b431c
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.331171 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd7p6" event={"ID":"2afdbd55-d7bc-4744-878a-389d84f66824","Type":"ContainerStarted","Data":"974942f29d4d62efe8cdcf36b97d43bd0ca87f5c2097d7aafc1e1fb97b4b431c"}
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.339801 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerStarted","Data":"a7aae26ec203bd2e986ae9be42d4b08839010261c6b2f9a1434915959bd7222f"}
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.339973 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-central-agent" containerID="cri-o://9ce34cda995fe452860e9abc7ab7b3cfb26358bb7f178b5d97416d44624c3008" gracePeriod=30
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.340051 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.340345 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="proxy-httpd" containerID="cri-o://a7aae26ec203bd2e986ae9be42d4b08839010261c6b2f9a1434915959bd7222f" gracePeriod=30
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.340374 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-notification-agent" containerID="cri-o://df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d" gracePeriod=30
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.340456 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="sg-core" containerID="cri-o://921ae9e679b6cd28353ace79deed51f63e8f6a2b6c588520f7eba34a1d29119e" gracePeriod=30
Dec 06 09:25:00 crc kubenswrapper[4672]: I1206 09:25:00.371545 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.268223596 podStartE2EDuration="5.371525824s" podCreationTimestamp="2025-12-06 09:24:55 +0000 UTC" firstStartedPulling="2025-12-06 09:24:56.107948838 +0000 UTC m=+1113.852209135" lastFinishedPulling="2025-12-06 09:24:59.211251076 +0000 UTC m=+1116.955511363" observedRunningTime="2025-12-06 09:25:00.364812843 +0000 UTC m=+1118.109073120" watchObservedRunningTime="2025-12-06 09:25:00.371525824 +0000 UTC m=+1118.115786111"
UTC m=+1118.115786111" Dec 06 09:25:00 crc kubenswrapper[4672]: E1206 09:25:00.888106 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd631857_ab49_4ebb_be4e_5ea40cf50670.slice/crio-df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd631857_ab49_4ebb_be4e_5ea40cf50670.slice/crio-conmon-df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:25:01 crc kubenswrapper[4672]: I1206 09:25:01.357507 4672 generic.go:334] "Generic (PLEG): container finished" podID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerID="a7aae26ec203bd2e986ae9be42d4b08839010261c6b2f9a1434915959bd7222f" exitCode=0 Dec 06 09:25:01 crc kubenswrapper[4672]: I1206 09:25:01.357539 4672 generic.go:334] "Generic (PLEG): container finished" podID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerID="921ae9e679b6cd28353ace79deed51f63e8f6a2b6c588520f7eba34a1d29119e" exitCode=2 Dec 06 09:25:01 crc kubenswrapper[4672]: I1206 09:25:01.357548 4672 generic.go:334] "Generic (PLEG): container finished" podID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerID="df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d" exitCode=0 Dec 06 09:25:01 crc kubenswrapper[4672]: I1206 09:25:01.357567 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerDied","Data":"a7aae26ec203bd2e986ae9be42d4b08839010261c6b2f9a1434915959bd7222f"} Dec 06 09:25:01 crc kubenswrapper[4672]: I1206 09:25:01.357592 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerDied","Data":"921ae9e679b6cd28353ace79deed51f63e8f6a2b6c588520f7eba34a1d29119e"} Dec 06 09:25:01 crc kubenswrapper[4672]: I1206 09:25:01.357617 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerDied","Data":"df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d"} Dec 06 09:25:06 crc kubenswrapper[4672]: I1206 09:25:06.403357 4672 generic.go:334] "Generic (PLEG): container finished" podID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerID="9ce34cda995fe452860e9abc7ab7b3cfb26358bb7f178b5d97416d44624c3008" exitCode=0 Dec 06 09:25:06 crc kubenswrapper[4672]: I1206 09:25:06.403404 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerDied","Data":"9ce34cda995fe452860e9abc7ab7b3cfb26358bb7f178b5d97416d44624c3008"} Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.333087 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.434477 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd631857-ab49-4ebb-be4e-5ea40cf50670","Type":"ContainerDied","Data":"f6ab52ba3bfa6190ca973cd637f1850dcaf74d30889044098ad61865b387073d"} Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.434968 4672 scope.go:117] "RemoveContainer" containerID="a7aae26ec203bd2e986ae9be42d4b08839010261c6b2f9a1434915959bd7222f" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.434515 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.436164 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd7p6" event={"ID":"2afdbd55-d7bc-4744-878a-389d84f66824","Type":"ContainerStarted","Data":"50a8592699b7ae495abb8cd99ba5e3f70412e0bcffd0a769435c6ab04dcdfda1"} Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.448940 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-scripts\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.448983 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-log-httpd\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449161 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h4ls\" (UniqueName: \"kubernetes.io/projected/fd631857-ab49-4ebb-be4e-5ea40cf50670-kube-api-access-7h4ls\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449213 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-sg-core-conf-yaml\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449243 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-run-httpd\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449262 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-config-data\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449302 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-combined-ca-bundle\") pod \"fd631857-ab49-4ebb-be4e-5ea40cf50670\" (UID: \"fd631857-ab49-4ebb-be4e-5ea40cf50670\") " Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449555 4672 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.449651 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.455218 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd631857-ab49-4ebb-be4e-5ea40cf50670-kube-api-access-7h4ls" (OuterVolumeSpecName: "kube-api-access-7h4ls") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "kube-api-access-7h4ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.456263 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-scripts" (OuterVolumeSpecName: "scripts") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.464349 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kd7p6" podStartSLOduration=2.12239043 podStartE2EDuration="11.464328978s" podCreationTimestamp="2025-12-06 09:24:58 +0000 UTC" firstStartedPulling="2025-12-06 09:24:59.675131958 +0000 UTC m=+1117.419392245" lastFinishedPulling="2025-12-06 09:25:09.017070496 +0000 UTC m=+1126.761330793" observedRunningTime="2025-12-06 09:25:09.459667662 +0000 UTC m=+1127.203927949" watchObservedRunningTime="2025-12-06 09:25:09.464328978 +0000 UTC m=+1127.208589275" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.464562 4672 scope.go:117] "RemoveContainer" containerID="921ae9e679b6cd28353ace79deed51f63e8f6a2b6c588520f7eba34a1d29119e" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.479878 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.484478 4672 scope.go:117] "RemoveContainer" containerID="df7b767ea2e3fb528e7c14b3196b4e1d985c77f781cafc11f6e45b4d8a6fd50d" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.507462 4672 scope.go:117] "RemoveContainer" containerID="9ce34cda995fe452860e9abc7ab7b3cfb26358bb7f178b5d97416d44624c3008" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.542947 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.548033 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-config-data" (OuterVolumeSpecName: "config-data") pod "fd631857-ab49-4ebb-be4e-5ea40cf50670" (UID: "fd631857-ab49-4ebb-be4e-5ea40cf50670"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551150 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551877 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551894 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551903 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551911 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd631857-ab49-4ebb-be4e-5ea40cf50670-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551920 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h4ls\" (UniqueName: \"kubernetes.io/projected/fd631857-ab49-4ebb-be4e-5ea40cf50670-kube-api-access-7h4ls\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.551928 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd631857-ab49-4ebb-be4e-5ea40cf50670-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.774395 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.791109 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.806554 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Dec 06 09:25:09 crc kubenswrapper[4672]: E1206 09:25:09.807218 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="proxy-httpd" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.807313 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="proxy-httpd" Dec 06 09:25:09 crc kubenswrapper[4672]: E1206 09:25:09.807402 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="sg-core" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.807477 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="sg-core" Dec 06 09:25:09 crc kubenswrapper[4672]: E1206 09:25:09.807548 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-notification-agent" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.807671 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-notification-agent" Dec 06 09:25:09 crc kubenswrapper[4672]: E1206 09:25:09.807766 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-central-agent" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.807843 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-central-agent" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.808111 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="sg-core" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.808196 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-central-agent" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.808268 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="ceilometer-notification-agent" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.808369 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" containerName="proxy-httpd" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.810347 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.813547 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.814045 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.820722 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959072 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-scripts\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959264 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqmb\" (UniqueName: \"kubernetes.io/projected/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-kube-api-access-sqqmb\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959374 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-log-httpd\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959499 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-config-data\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959727 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959875 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:09 crc kubenswrapper[4672]: I1206 09:25:09.959966 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-run-httpd\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062056 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 
09:25:10.062113 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-run-httpd\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062158 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-scripts\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062197 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqmb\" (UniqueName: \"kubernetes.io/projected/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-kube-api-access-sqqmb\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062225 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-log-httpd\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062259 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-config-data\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062295 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.062749 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-run-httpd\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.063806 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-log-httpd\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.068178 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-scripts\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.073006 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.073951 4672 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-config-data\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.078778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.083961 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqmb\" (UniqueName: \"kubernetes.io/projected/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-kube-api-access-sqqmb\") pod \"ceilometer-0\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.131918 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.583987 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd631857-ab49-4ebb-be4e-5ea40cf50670" path="/var/lib/kubelet/pods/fd631857-ab49-4ebb-be4e-5ea40cf50670/volumes" Dec 06 09:25:10 crc kubenswrapper[4672]: I1206 09:25:10.631687 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:10 crc kubenswrapper[4672]: W1206 09:25:10.637890 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd66af06d_1cf9_4a5e_9649_d22bc9f00b7e.slice/crio-77fd6e4591331c63f43553a4af2f7b37ddf450e2de1ec00394ff05ca17f9956e WatchSource:0}: Error finding container 77fd6e4591331c63f43553a4af2f7b37ddf450e2de1ec00394ff05ca17f9956e: Status 404 returned error can't find the container with id 77fd6e4591331c63f43553a4af2f7b37ddf450e2de1ec00394ff05ca17f9956e Dec 06 09:25:11 crc kubenswrapper[4672]: I1206 09:25:11.459003 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerStarted","Data":"a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8"} Dec 06 09:25:11 crc kubenswrapper[4672]: I1206 09:25:11.459328 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerStarted","Data":"77fd6e4591331c63f43553a4af2f7b37ddf450e2de1ec00394ff05ca17f9956e"} Dec 06 09:25:12 crc kubenswrapper[4672]: I1206 09:25:12.469619 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerStarted","Data":"3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c"} Dec 06 09:25:14 crc kubenswrapper[4672]: I1206 09:25:14.504898 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerStarted","Data":"c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013"} Dec 06 09:25:16 crc kubenswrapper[4672]: I1206 09:25:16.522859 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerStarted","Data":"81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39"} Dec 06 09:25:16 crc 
Dec 06 09:25:16 crc kubenswrapper[4672]: I1206 09:25:16.548081 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.681814398 podStartE2EDuration="7.548063526s" podCreationTimestamp="2025-12-06 09:25:09 +0000 UTC" firstStartedPulling="2025-12-06 09:25:10.639879469 +0000 UTC m=+1128.384139756" lastFinishedPulling="2025-12-06 09:25:15.506128597 +0000 UTC m=+1133.250388884" observedRunningTime="2025-12-06 09:25:16.54416269 +0000 UTC m=+1134.288422977" watchObservedRunningTime="2025-12-06 09:25:16.548063526 +0000 UTC m=+1134.292323803"
Dec 06 09:25:21 crc kubenswrapper[4672]: E1206 09:25:21.348978 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2afdbd55_d7bc_4744_878a_389d84f66824.slice/crio-conmon-50a8592699b7ae495abb8cd99ba5e3f70412e0bcffd0a769435c6ab04dcdfda1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2afdbd55_d7bc_4744_878a_389d84f66824.slice/crio-50a8592699b7ae495abb8cd99ba5e3f70412e0bcffd0a769435c6ab04dcdfda1.scope\": RecentStats: unable to find data in memory cache]"
Dec 06 09:25:21 crc kubenswrapper[4672]: I1206 09:25:21.566334 4672 generic.go:334] "Generic (PLEG): container finished" podID="2afdbd55-d7bc-4744-878a-389d84f66824" containerID="50a8592699b7ae495abb8cd99ba5e3f70412e0bcffd0a769435c6ab04dcdfda1" exitCode=0
Dec 06 09:25:21 crc kubenswrapper[4672]: I1206 09:25:21.566374 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd7p6" event={"ID":"2afdbd55-d7bc-4744-878a-389d84f66824","Type":"ContainerDied","Data":"50a8592699b7ae495abb8cd99ba5e3f70412e0bcffd0a769435c6ab04dcdfda1"}
Dec 06 09:25:22 crc kubenswrapper[4672]: I1206 09:25:22.922265 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.037439 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-combined-ca-bundle\") pod \"2afdbd55-d7bc-4744-878a-389d84f66824\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") "
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.037537 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-config-data\") pod \"2afdbd55-d7bc-4744-878a-389d84f66824\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") "
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.037659 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-scripts\") pod \"2afdbd55-d7bc-4744-878a-389d84f66824\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") "
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.038351 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk452\" (UniqueName: \"kubernetes.io/projected/2afdbd55-d7bc-4744-878a-389d84f66824-kube-api-access-jk452\") pod \"2afdbd55-d7bc-4744-878a-389d84f66824\" (UID: \"2afdbd55-d7bc-4744-878a-389d84f66824\") "
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.042794 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-scripts" (OuterVolumeSpecName: "scripts") pod "2afdbd55-d7bc-4744-878a-389d84f66824" (UID: "2afdbd55-d7bc-4744-878a-389d84f66824"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.043748 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afdbd55-d7bc-4744-878a-389d84f66824-kube-api-access-jk452" (OuterVolumeSpecName: "kube-api-access-jk452") pod "2afdbd55-d7bc-4744-878a-389d84f66824" (UID: "2afdbd55-d7bc-4744-878a-389d84f66824"). InnerVolumeSpecName "kube-api-access-jk452". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.076697 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-config-data" (OuterVolumeSpecName: "config-data") pod "2afdbd55-d7bc-4744-878a-389d84f66824" (UID: "2afdbd55-d7bc-4744-878a-389d84f66824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.078657 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2afdbd55-d7bc-4744-878a-389d84f66824" (UID: "2afdbd55-d7bc-4744-878a-389d84f66824"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.139917 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk452\" (UniqueName: \"kubernetes.io/projected/2afdbd55-d7bc-4744-878a-389d84f66824-kube-api-access-jk452\") on node \"crc\" DevicePath \"\""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.139943 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.139952 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.139962 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2afdbd55-d7bc-4744-878a-389d84f66824-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.604096 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd7p6" event={"ID":"2afdbd55-d7bc-4744-878a-389d84f66824","Type":"ContainerDied","Data":"974942f29d4d62efe8cdcf36b97d43bd0ca87f5c2097d7aafc1e1fb97b4b431c"}
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.604174 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="974942f29d4d62efe8cdcf36b97d43bd0ca87f5c2097d7aafc1e1fb97b4b431c"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.604275 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd7p6"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.727501 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 06 09:25:23 crc kubenswrapper[4672]: E1206 09:25:23.728507 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afdbd55-d7bc-4744-878a-389d84f66824" containerName="nova-cell0-conductor-db-sync"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.728635 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afdbd55-d7bc-4744-878a-389d84f66824" containerName="nova-cell0-conductor-db-sync"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.728952 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afdbd55-d7bc-4744-878a-389d84f66824" containerName="nova-cell0-conductor-db-sync"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.729740 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.737707 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-txpqx"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.737873 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.752112 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.850810 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn27p\" (UniqueName: \"kubernetes.io/projected/ee39fc48-02ae-46b6-90b8-5b82cafad74d-kube-api-access-xn27p\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.850975 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee39fc48-02ae-46b6-90b8-5b82cafad74d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.850993 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee39fc48-02ae-46b6-90b8-5b82cafad74d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.952994 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn27p\" (UniqueName: \"kubernetes.io/projected/ee39fc48-02ae-46b6-90b8-5b82cafad74d-kube-api-access-xn27p\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.953452 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee39fc48-02ae-46b6-90b8-5b82cafad74d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.953477 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee39fc48-02ae-46b6-90b8-5b82cafad74d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.956948 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee39fc48-02ae-46b6-90b8-5b82cafad74d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.958243 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee39fc48-02ae-46b6-90b8-5b82cafad74d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0"
(UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:25:23 crc kubenswrapper[4672]: I1206 09:25:23.980404 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn27p\" (UniqueName: \"kubernetes.io/projected/ee39fc48-02ae-46b6-90b8-5b82cafad74d-kube-api-access-xn27p\") pod \"nova-cell0-conductor-0\" (UID: \"ee39fc48-02ae-46b6-90b8-5b82cafad74d\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:25:24 crc kubenswrapper[4672]: I1206 09:25:24.052856 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:25:24 crc kubenswrapper[4672]: I1206 09:25:24.531995 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:25:24 crc kubenswrapper[4672]: I1206 09:25:24.613210 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ee39fc48-02ae-46b6-90b8-5b82cafad74d","Type":"ContainerStarted","Data":"0054ed77e91ff3b4aaf19efbfeb2fb8f6b3a9554d7a3a1822bca7c8c915226ce"} Dec 06 09:25:25 crc kubenswrapper[4672]: I1206 09:25:25.622882 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ee39fc48-02ae-46b6-90b8-5b82cafad74d","Type":"ContainerStarted","Data":"30dc4d722ca69957cbc62d6679fa0bb18e2e7b975e37e68b075c3ea08e2f60de"} Dec 06 09:25:25 crc kubenswrapper[4672]: I1206 09:25:25.624274 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 06 09:25:25 crc kubenswrapper[4672]: I1206 09:25:25.649741 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.649721149 podStartE2EDuration="2.649721149s" podCreationTimestamp="2025-12-06 09:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:25.648229628 +0000 UTC m=+1143.392489925" watchObservedRunningTime="2025-12-06 09:25:25.649721149 +0000 UTC m=+1143.393981446" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.102128 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.592404 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mc84g"] Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.594178 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.599735 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.601120 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.605853 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mc84g"] Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.661731 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77mgt\" (UniqueName: \"kubernetes.io/projected/8b02ee21-c208-483e-b2e9-8830c54605d7-kube-api-access-77mgt\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.661835 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-config-data\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.661922 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.662151 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-scripts\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.763821 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.764028 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-scripts\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.764128 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77mgt\" (UniqueName: \"kubernetes.io/projected/8b02ee21-c208-483e-b2e9-8830c54605d7-kube-api-access-77mgt\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.764208 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-config-data\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.769755 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.771201 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-config-data\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.775175 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-scripts\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.784664 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77mgt\" (UniqueName: \"kubernetes.io/projected/8b02ee21-c208-483e-b2e9-8830c54605d7-kube-api-access-77mgt\") pod \"nova-cell0-cell-mapping-mc84g\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.803710 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.805094 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.849779 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.865453 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqxz\" (UniqueName: \"kubernetes.io/projected/74e55b59-cf24-44da-bb4e-045a22aea20b-kube-api-access-mcqxz\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.865511 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-config-data\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.865548 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e55b59-cf24-44da-bb4e-045a22aea20b-logs\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.865597 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.866816 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.901875 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.913735 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.931458 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.945467 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970122 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970267 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701514a6-6e35-4246-96fc-c1566f9f80ad-logs\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970318 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqxz\" (UniqueName: \"kubernetes.io/projected/74e55b59-cf24-44da-bb4e-045a22aea20b-kube-api-access-mcqxz\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970349 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970372 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-config-data\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970395 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-config-data\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970427 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e55b59-cf24-44da-bb4e-045a22aea20b-logs\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.970473 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5k8\" (UniqueName: \"kubernetes.io/projected/701514a6-6e35-4246-96fc-c1566f9f80ad-kube-api-access-kw5k8\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.977870 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 
09:25:29.979112 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e55b59-cf24-44da-bb4e-045a22aea20b-logs\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.988976 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-config-data\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.996677 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:29 crc kubenswrapper[4672]: I1206 09:25:29.998611 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.009949 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.018184 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.051926 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqxz\" (UniqueName: \"kubernetes.io/projected/74e55b59-cf24-44da-bb4e-045a22aea20b-kube-api-access-mcqxz\") pod \"nova-api-0\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " pod="openstack/nova-api-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.066054 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081143 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888wg\" (UniqueName: \"kubernetes.io/projected/12457efd-fc0b-4659-83b8-93b93139eb73-kube-api-access-888wg\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081197 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5k8\" (UniqueName: \"kubernetes.io/projected/701514a6-6e35-4246-96fc-c1566f9f80ad-kube-api-access-kw5k8\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081225 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081305 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701514a6-6e35-4246-96fc-c1566f9f80ad-logs\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081342 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-config-data\") pod 
\"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081370 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-config-data\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.081393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.082920 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701514a6-6e35-4246-96fc-c1566f9f80ad-logs\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.099509 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-config-data\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.101501 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.182766 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.182938 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-config-data\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.183014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888wg\" (UniqueName: \"kubernetes.io/projected/12457efd-fc0b-4659-83b8-93b93139eb73-kube-api-access-888wg\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.188485 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5k8\" (UniqueName: \"kubernetes.io/projected/701514a6-6e35-4246-96fc-c1566f9f80ad-kube-api-access-kw5k8\") pod \"nova-metadata-0\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.193055 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.193464 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.198094 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-config-data\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.219679 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.233346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.241503 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.258076 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.273892 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888wg\" (UniqueName: \"kubernetes.io/projected/12457efd-fc0b-4659-83b8-93b93139eb73-kube-api-access-888wg\") pod \"nova-scheduler-0\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.290747 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.290859 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls82g\" (UniqueName: \"kubernetes.io/projected/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-kube-api-access-ls82g\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.290910 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.341697 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.392923 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.392998 
4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.393093 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls82g\" (UniqueName: \"kubernetes.io/projected/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-kube-api-access-ls82g\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.403529 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bfd54d96c-j66pm"] Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.404866 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.431041 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.433172 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.436416 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.453807 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bfd54d96c-j66pm"] Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.476740 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls82g\" (UniqueName: \"kubernetes.io/projected/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-kube-api-access-ls82g\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.510202 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.510273 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-dns-svc\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.510308 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.510331 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-config\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.510401 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc8kq\" (UniqueName: \"kubernetes.io/projected/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-kube-api-access-nc8kq\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.556123 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.613682 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc8kq\" (UniqueName: \"kubernetes.io/projected/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-kube-api-access-nc8kq\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.613778 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.613870 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-dns-svc\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.613926 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.613957 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-config\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.615005 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-config\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 
09:25:30.615176 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.640946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.643276 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-dns-svc\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.687757 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc8kq\" (UniqueName: \"kubernetes.io/projected/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-kube-api-access-nc8kq\") pod \"dnsmasq-dns-6bfd54d96c-j66pm\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") " pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:30 crc kubenswrapper[4672]: I1206 09:25:30.772134 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.117988 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:31 crc kubenswrapper[4672]: W1206 09:25:31.121301 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e55b59_cf24_44da_bb4e_045a22aea20b.slice/crio-b303be087ae9c0074a0d829f68249b45039e48e66df8598d5d3a4e0be3f54fa6 WatchSource:0}: Error finding container b303be087ae9c0074a0d829f68249b45039e48e66df8598d5d3a4e0be3f54fa6: Status 404 returned error can't find the container with id b303be087ae9c0074a0d829f68249b45039e48e66df8598d5d3a4e0be3f54fa6 Dec 06 09:25:31 crc kubenswrapper[4672]: W1206 09:25:31.280322 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b02ee21_c208_483e_b2e9_8830c54605d7.slice/crio-88bde2e856eec8316e7f660f327c67f1ced4af2fd2c7b1993414a8435348c747 WatchSource:0}: Error finding container 88bde2e856eec8316e7f660f327c67f1ced4af2fd2c7b1993414a8435348c747: Status 404 returned error can't find the container with id 88bde2e856eec8316e7f660f327c67f1ced4af2fd2c7b1993414a8435348c747 Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.281495 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mc84g"] Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.374662 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.448552 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.479996 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:25:31 crc 
kubenswrapper[4672]: I1206 09:25:31.620540 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bfd54d96c-j66pm"] Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.636208 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zlxms"] Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.637262 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.639877 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.643267 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.657703 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zlxms"] Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.736123 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12457efd-fc0b-4659-83b8-93b93139eb73","Type":"ContainerStarted","Data":"d7111207fa5022fe1b55800a94fff3e37ad9152778927a54a6287d62e1ae0a9b"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.747549 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vpkb\" (UniqueName: \"kubernetes.io/projected/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-kube-api-access-5vpkb\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.747829 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-scripts\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.747951 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.748048 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-config-data\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.751364 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" event={"ID":"c28bd8fe-e324-4fb3-9056-e15d5dc67b78","Type":"ContainerStarted","Data":"91285225fd5495a2890a1b99ab6e38976d50bf64d571d8ecc277b8f36b099a0b"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.753147 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"701514a6-6e35-4246-96fc-c1566f9f80ad","Type":"ContainerStarted","Data":"8c26d4665dd436b4a6298930f430f7775c856670adf3929e93deadf6d652128c"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.754323 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5","Type":"ContainerStarted","Data":"32ae054a31228a97568753a627c9b670f9be87f8cec95f8643e29c53f21a7bfc"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.776105 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mc84g" event={"ID":"8b02ee21-c208-483e-b2e9-8830c54605d7","Type":"ContainerStarted","Data":"02d219a88ba9a4eced955f38312f6ea28188845dee97713baf52fb4683cc0a89"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.776237 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mc84g" event={"ID":"8b02ee21-c208-483e-b2e9-8830c54605d7","Type":"ContainerStarted","Data":"88bde2e856eec8316e7f660f327c67f1ced4af2fd2c7b1993414a8435348c747"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.790287 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74e55b59-cf24-44da-bb4e-045a22aea20b","Type":"ContainerStarted","Data":"b303be087ae9c0074a0d829f68249b45039e48e66df8598d5d3a4e0be3f54fa6"} Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.805439 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mc84g" podStartSLOduration=2.805416677 podStartE2EDuration="2.805416677s" podCreationTimestamp="2025-12-06 09:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:31.801083181 +0000 UTC m=+1149.545343468" watchObservedRunningTime="2025-12-06 09:25:31.805416677 +0000 UTC m=+1149.549676964" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.850284 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.850333 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-config-data\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.850425 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vpkb\" (UniqueName: \"kubernetes.io/projected/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-kube-api-access-5vpkb\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.851531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-scripts\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " 
pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.877406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.882287 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-config-data\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.882825 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-scripts\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.889295 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vpkb\" (UniqueName: \"kubernetes.io/projected/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-kube-api-access-5vpkb\") pod \"nova-cell1-conductor-db-sync-zlxms\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:31 crc kubenswrapper[4672]: I1206 09:25:31.958236 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:32 crc kubenswrapper[4672]: I1206 09:25:32.537883 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zlxms"] Dec 06 09:25:32 crc kubenswrapper[4672]: W1206 09:25:32.581996 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ef5f2b_0e30_4ee4_b6b3_957ee08ad335.slice/crio-d90650203eff8a387804df4c568ce1d441c34fdc46e43e33d1ab080fc5d70978 WatchSource:0}: Error finding container d90650203eff8a387804df4c568ce1d441c34fdc46e43e33d1ab080fc5d70978: Status 404 returned error can't find the container with id d90650203eff8a387804df4c568ce1d441c34fdc46e43e33d1ab080fc5d70978 Dec 06 09:25:32 crc kubenswrapper[4672]: I1206 09:25:32.860515 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zlxms" event={"ID":"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335","Type":"ContainerStarted","Data":"d90650203eff8a387804df4c568ce1d441c34fdc46e43e33d1ab080fc5d70978"} Dec 06 09:25:32 crc kubenswrapper[4672]: I1206 09:25:32.872316 4672 generic.go:334] "Generic (PLEG): container finished" podID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerID="b5cc9d5611ffdd1b7e3607894e6d8c1e3b295332316db1bce88af0e291228e5b" exitCode=0 Dec 06 09:25:32 crc kubenswrapper[4672]: I1206 09:25:32.873435 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" event={"ID":"c28bd8fe-e324-4fb3-9056-e15d5dc67b78","Type":"ContainerDied","Data":"b5cc9d5611ffdd1b7e3607894e6d8c1e3b295332316db1bce88af0e291228e5b"} Dec 06 09:25:33 crc kubenswrapper[4672]: I1206 09:25:33.894989 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" event={"ID":"c28bd8fe-e324-4fb3-9056-e15d5dc67b78","Type":"ContainerStarted","Data":"3a3414f86377df1065630c8a1819ec4af10ac678f8ccacaf0a9cdedf02ad3a9c"} Dec 06 09:25:33 crc kubenswrapper[4672]: I1206 09:25:33.895383 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:33 crc kubenswrapper[4672]: I1206 09:25:33.898944 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zlxms" event={"ID":"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335","Type":"ContainerStarted","Data":"a2a360d83bce94c2323b21912f26a423771f5c2386dffa6443b23b111c1fc5f9"} Dec 06 09:25:33 crc kubenswrapper[4672]: I1206 09:25:33.923962 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" podStartSLOduration=3.923942031 podStartE2EDuration="3.923942031s" podCreationTimestamp="2025-12-06 09:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:33.921124085 +0000 UTC m=+1151.665384372" watchObservedRunningTime="2025-12-06 09:25:33.923942031 +0000 UTC m=+1151.668202318" Dec 06 09:25:33 crc kubenswrapper[4672]: I1206 09:25:33.949950 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zlxms" podStartSLOduration=2.949913554 podStartE2EDuration="2.949913554s" podCreationTimestamp="2025-12-06 09:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:33.946040079 +0000 UTC m=+1151.690300366" watchObservedRunningTime="2025-12-06 09:25:33.949913554 +0000 UTC m=+1151.694173831" Dec 06 09:25:34 crc kubenswrapper[4672]: I1206 09:25:34.383357 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:34 crc kubenswrapper[4672]: I1206 09:25:34.413318 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.957044 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74e55b59-cf24-44da-bb4e-045a22aea20b","Type":"ContainerStarted","Data":"0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa"} Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.957469 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74e55b59-cf24-44da-bb4e-045a22aea20b","Type":"ContainerStarted","Data":"3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1"} Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.959732 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12457efd-fc0b-4659-83b8-93b93139eb73","Type":"ContainerStarted","Data":"94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f"} Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.961759 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701514a6-6e35-4246-96fc-c1566f9f80ad","Type":"ContainerStarted","Data":"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4"} Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.961782 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"701514a6-6e35-4246-96fc-c1566f9f80ad","Type":"ContainerStarted","Data":"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7"} Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.961863 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-log" containerID="cri-o://7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7" gracePeriod=30 Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.962077 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-metadata" containerID="cri-o://f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4" gracePeriod=30 Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.967217 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5","Type":"ContainerStarted","Data":"aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283"} Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.967479 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283" gracePeriod=30 Dec 06 09:25:36 crc kubenswrapper[4672]: I1206 09:25:36.988625 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.355441617 podStartE2EDuration="7.988593204s" podCreationTimestamp="2025-12-06 09:25:29 +0000 UTC" firstStartedPulling="2025-12-06 09:25:31.124374866 +0000 UTC m=+1148.868635153" lastFinishedPulling="2025-12-06 09:25:35.757526443 +0000 UTC m=+1153.501786740" observedRunningTime="2025-12-06 09:25:36.981971185 +0000 UTC m=+1154.726231472" watchObservedRunningTime="2025-12-06 09:25:36.988593204 +0000 UTC m=+1154.732853481" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.001020 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.634086421 podStartE2EDuration="8.00100112s" podCreationTimestamp="2025-12-06 09:25:29 +0000 UTC" firstStartedPulling="2025-12-06 09:25:31.392000471 +0000 UTC m=+1149.136260748" lastFinishedPulling="2025-12-06 09:25:35.75891516 +0000 UTC m=+1153.503175447" observedRunningTime="2025-12-06 09:25:36.997814854 +0000 UTC m=+1154.742075141" watchObservedRunningTime="2025-12-06 09:25:37.00100112 +0000 UTC m=+1154.745261407" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.015723 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.785942226 podStartE2EDuration="8.015706887s" podCreationTimestamp="2025-12-06 09:25:29 +0000 UTC" firstStartedPulling="2025-12-06 09:25:31.529287653 +0000 UTC m=+1149.273547940" lastFinishedPulling="2025-12-06 09:25:35.759052314 +0000 UTC m=+1153.503312601" observedRunningTime="2025-12-06 09:25:37.012734247 +0000 UTC m=+1154.756994534" watchObservedRunningTime="2025-12-06 09:25:37.015706887 +0000 UTC m=+1154.759967174" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.567940 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.599798 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.338856294 podStartE2EDuration="7.599777278s" podCreationTimestamp="2025-12-06 09:25:30 +0000 UTC" firstStartedPulling="2025-12-06 09:25:31.502506768 +0000 UTC m=+1149.246767055" lastFinishedPulling="2025-12-06 09:25:35.763427752 +0000 UTC m=+1153.507688039" observedRunningTime="2025-12-06 09:25:37.036219422 +0000 UTC m=+1154.780479709" watchObservedRunningTime="2025-12-06 09:25:37.599777278 +0000 UTC m=+1155.344037555" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.753527 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-combined-ca-bundle\") pod \"701514a6-6e35-4246-96fc-c1566f9f80ad\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.753667 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701514a6-6e35-4246-96fc-c1566f9f80ad-logs\") pod \"701514a6-6e35-4246-96fc-c1566f9f80ad\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.753723 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw5k8\" (UniqueName: \"kubernetes.io/projected/701514a6-6e35-4246-96fc-c1566f9f80ad-kube-api-access-kw5k8\") pod \"701514a6-6e35-4246-96fc-c1566f9f80ad\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.754215 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701514a6-6e35-4246-96fc-c1566f9f80ad-logs" (OuterVolumeSpecName: "logs") pod "701514a6-6e35-4246-96fc-c1566f9f80ad" (UID: "701514a6-6e35-4246-96fc-c1566f9f80ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.754559 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-config-data\") pod \"701514a6-6e35-4246-96fc-c1566f9f80ad\" (UID: \"701514a6-6e35-4246-96fc-c1566f9f80ad\") " Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.755124 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701514a6-6e35-4246-96fc-c1566f9f80ad-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.761259 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701514a6-6e35-4246-96fc-c1566f9f80ad-kube-api-access-kw5k8" (OuterVolumeSpecName: "kube-api-access-kw5k8") pod "701514a6-6e35-4246-96fc-c1566f9f80ad" (UID: "701514a6-6e35-4246-96fc-c1566f9f80ad"). InnerVolumeSpecName "kube-api-access-kw5k8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.788940 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701514a6-6e35-4246-96fc-c1566f9f80ad" (UID: "701514a6-6e35-4246-96fc-c1566f9f80ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.790803 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-config-data" (OuterVolumeSpecName: "config-data") pod "701514a6-6e35-4246-96fc-c1566f9f80ad" (UID: "701514a6-6e35-4246-96fc-c1566f9f80ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.856337 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.856923 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw5k8\" (UniqueName: \"kubernetes.io/projected/701514a6-6e35-4246-96fc-c1566f9f80ad-kube-api-access-kw5k8\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.856947 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701514a6-6e35-4246-96fc-c1566f9f80ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.977178 4672 generic.go:334] "Generic (PLEG): container finished" podID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerID="f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4" exitCode=0 Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.977208 4672 generic.go:334] "Generic (PLEG): container finished" podID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerID="7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7" exitCode=143 Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.977984 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.978084 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701514a6-6e35-4246-96fc-c1566f9f80ad","Type":"ContainerDied","Data":"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4"} Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.978209 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701514a6-6e35-4246-96fc-c1566f9f80ad","Type":"ContainerDied","Data":"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7"} Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.978232 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701514a6-6e35-4246-96fc-c1566f9f80ad","Type":"ContainerDied","Data":"8c26d4665dd436b4a6298930f430f7775c856670adf3929e93deadf6d652128c"} Dec 06 09:25:37 crc kubenswrapper[4672]: I1206 09:25:37.978247 4672 scope.go:117] "RemoveContainer" containerID="f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.002914 4672 scope.go:117] "RemoveContainer" containerID="7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.013844 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.021075 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.038698 4672 scope.go:117] "RemoveContainer" containerID="f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4" Dec 06 09:25:38 crc kubenswrapper[4672]: E1206 09:25:38.039161 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4\": container with ID starting with f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4 not found: ID does not exist" containerID="f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.039200 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4"} err="failed to get container status \"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4\": rpc error: code = NotFound desc = could not find container \"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4\": container with ID starting with f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4 not found: ID does not exist" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.039227 4672 scope.go:117] "RemoveContainer" containerID="7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7" Dec 06 09:25:38 crc kubenswrapper[4672]: E1206 09:25:38.039452 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7\": container with ID starting with 7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7 not found: ID does not exist" containerID="7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 
09:25:38.039469 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7"} err="failed to get container status \"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7\": rpc error: code = NotFound desc = could not find container \"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7\": container with ID starting with 7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7 not found: ID does not exist" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.039483 4672 scope.go:117] "RemoveContainer" containerID="f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.039756 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4"} err="failed to get container status \"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4\": rpc error: code = NotFound desc = could not find container \"f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4\": container with ID starting with f5ad4ea7aa870136b72bbb0bd3eac38be1ae1dfbdc85e328c1d7b0a7c2be88d4 not found: ID does not exist" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.039796 4672 scope.go:117] "RemoveContainer" containerID="7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.040064 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7"} err="failed to get container status \"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7\": rpc error: code = NotFound desc = could not find container \"7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7\": container with ID starting with 7787b5e6dc51292531b69726bb067898e5f6c26528bc920820369c29124c8fb7 not found: ID does not exist" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.041231 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:38 crc kubenswrapper[4672]: E1206 09:25:38.041588 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-metadata" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.041616 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-metadata" Dec 06 09:25:38 crc kubenswrapper[4672]: E1206 09:25:38.041640 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-log" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.041646 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-log" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.041799 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-metadata" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.041826 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" containerName="nova-metadata-log" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.042659 4672 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.050122 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.055227 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.132722 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.160533 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afae2fde-e366-49ff-867b-a698d71c22c6-logs\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.161937 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgq8\" (UniqueName: \"kubernetes.io/projected/afae2fde-e366-49ff-867b-a698d71c22c6-kube-api-access-swgq8\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.162082 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.162221 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.162305 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-config-data\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.263685 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.263758 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-config-data\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.263858 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afae2fde-e366-49ff-867b-a698d71c22c6-logs\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " 
pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.263897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgq8\" (UniqueName: \"kubernetes.io/projected/afae2fde-e366-49ff-867b-a698d71c22c6-kube-api-access-swgq8\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.263933 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.264430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afae2fde-e366-49ff-867b-a698d71c22c6-logs\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.267420 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-config-data\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.267493 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.277156 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.288058 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgq8\" (UniqueName: \"kubernetes.io/projected/afae2fde-e366-49ff-867b-a698d71c22c6-kube-api-access-swgq8\") pod \"nova-metadata-0\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.358994 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.567994 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701514a6-6e35-4246-96fc-c1566f9f80ad" path="/var/lib/kubelet/pods/701514a6-6e35-4246-96fc-c1566f9f80ad/volumes" Dec 06 09:25:38 crc kubenswrapper[4672]: I1206 09:25:38.851666 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:39 crc kubenswrapper[4672]: I1206 09:25:39.010586 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afae2fde-e366-49ff-867b-a698d71c22c6","Type":"ContainerStarted","Data":"90b58163f542efc84999b9415ceed90e7cd72bd86f9213c01031786897a6f13f"} Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.023067 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afae2fde-e366-49ff-867b-a698d71c22c6","Type":"ContainerStarted","Data":"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3"} Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.023438 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afae2fde-e366-49ff-867b-a698d71c22c6","Type":"ContainerStarted","Data":"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335"} Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.060463 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.060441192 podStartE2EDuration="2.060441192s" podCreationTimestamp="2025-12-06 09:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:40.055393395 +0000 UTC m=+1157.799653682" watchObservedRunningTime="2025-12-06 09:25:40.060441192 +0000 UTC m=+1157.804701479" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.194845 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.194894 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.262957 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.432126 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.432178 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.455808 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.565277 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.773826 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" Dec 06 09:25:40 crc kubenswrapper[4672]: I1206 09:25:40.866714 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d6875bb67-xw22g"] Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.032198 4672 generic.go:334] "Generic (PLEG): container 
finished" podID="8b02ee21-c208-483e-b2e9-8830c54605d7" containerID="02d219a88ba9a4eced955f38312f6ea28188845dee97713baf52fb4683cc0a89" exitCode=0 Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.032389 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerName="dnsmasq-dns" containerID="cri-o://a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb" gracePeriod=10 Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.032466 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mc84g" event={"ID":"8b02ee21-c208-483e-b2e9-8830c54605d7","Type":"ContainerDied","Data":"02d219a88ba9a4eced955f38312f6ea28188845dee97713baf52fb4683cc0a89"} Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.103885 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.280318 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.280583 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.582020 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.739830 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-dns-svc\") pod \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.739890 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-config\") pod \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.739987 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjjpq\" (UniqueName: \"kubernetes.io/projected/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-kube-api-access-rjjpq\") pod \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.740122 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-nb\") pod \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.740355 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-sb\") pod \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\" (UID: \"1601ecfd-770a-4d88-9a5e-fb465a0b98f0\") " Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.746865 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-kube-api-access-rjjpq" (OuterVolumeSpecName: "kube-api-access-rjjpq") pod "1601ecfd-770a-4d88-9a5e-fb465a0b98f0" (UID: "1601ecfd-770a-4d88-9a5e-fb465a0b98f0"). InnerVolumeSpecName "kube-api-access-rjjpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.826894 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1601ecfd-770a-4d88-9a5e-fb465a0b98f0" (UID: "1601ecfd-770a-4d88-9a5e-fb465a0b98f0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.843054 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.843083 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjjpq\" (UniqueName: \"kubernetes.io/projected/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-kube-api-access-rjjpq\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.858531 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-config" (OuterVolumeSpecName: "config") pod "1601ecfd-770a-4d88-9a5e-fb465a0b98f0" (UID: "1601ecfd-770a-4d88-9a5e-fb465a0b98f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.860824 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1601ecfd-770a-4d88-9a5e-fb465a0b98f0" (UID: "1601ecfd-770a-4d88-9a5e-fb465a0b98f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.863140 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1601ecfd-770a-4d88-9a5e-fb465a0b98f0" (UID: "1601ecfd-770a-4d88-9a5e-fb465a0b98f0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.944212 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.944243 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:41 crc kubenswrapper[4672]: I1206 09:25:41.944254 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1601ecfd-770a-4d88-9a5e-fb465a0b98f0-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.042228 4672 generic.go:334] "Generic (PLEG): container finished" podID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerID="a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb" exitCode=0 Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.042301 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" event={"ID":"1601ecfd-770a-4d88-9a5e-fb465a0b98f0","Type":"ContainerDied","Data":"a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb"} Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.042347 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" event={"ID":"1601ecfd-770a-4d88-9a5e-fb465a0b98f0","Type":"ContainerDied","Data":"705dd69ff52892f097c13b0c504c7592fb8487f4b2bee996854e6481248050cc"} Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.042367 4672 scope.go:117] "RemoveContainer" containerID="a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.042488 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d6875bb67-xw22g" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.086418 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d6875bb67-xw22g"] Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.096615 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d6875bb67-xw22g"] Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.100277 4672 scope.go:117] "RemoveContainer" containerID="c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.134762 4672 scope.go:117] "RemoveContainer" containerID="a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb" Dec 06 09:25:42 crc kubenswrapper[4672]: E1206 09:25:42.135240 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb\": container with ID starting with a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb not found: ID does not exist" containerID="a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.135289 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb"} err="failed to get container status \"a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb\": rpc error: code = NotFound desc = could not find container \"a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb\": container with ID starting with a66f3532346abb63e584d93d265146eb0811561ebfa6f4fd948f61ca6a6105fb not found: ID does not exist" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.135315 4672 scope.go:117] "RemoveContainer" containerID="c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c" Dec 06 09:25:42 crc kubenswrapper[4672]: E1206 09:25:42.135640 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c\": container with ID starting with c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c not found: ID does not exist" containerID="c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.135674 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c"} err="failed to get container status \"c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c\": rpc error: code = NotFound desc = could not find container \"c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c\": container with ID starting with c9aae3441a78a1a5aab55b0bbb2bc8b9c2ee997420c9af7f90dd14217bafbf8c not found: ID does not exist" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.319780 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.319831 4672 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.428781 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.561519 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-config-data\") pod \"8b02ee21-c208-483e-b2e9-8830c54605d7\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.561680 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-scripts\") pod \"8b02ee21-c208-483e-b2e9-8830c54605d7\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.561751 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77mgt\" (UniqueName: \"kubernetes.io/projected/8b02ee21-c208-483e-b2e9-8830c54605d7-kube-api-access-77mgt\") pod \"8b02ee21-c208-483e-b2e9-8830c54605d7\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.561769 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-combined-ca-bundle\") pod \"8b02ee21-c208-483e-b2e9-8830c54605d7\" (UID: \"8b02ee21-c208-483e-b2e9-8830c54605d7\") " Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.582053 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b02ee21-c208-483e-b2e9-8830c54605d7-kube-api-access-77mgt" (OuterVolumeSpecName: "kube-api-access-77mgt") pod "8b02ee21-c208-483e-b2e9-8830c54605d7" (UID: "8b02ee21-c208-483e-b2e9-8830c54605d7"). InnerVolumeSpecName "kube-api-access-77mgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.582831 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" path="/var/lib/kubelet/pods/1601ecfd-770a-4d88-9a5e-fb465a0b98f0/volumes" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.586896 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-scripts" (OuterVolumeSpecName: "scripts") pod "8b02ee21-c208-483e-b2e9-8830c54605d7" (UID: "8b02ee21-c208-483e-b2e9-8830c54605d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.609072 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-config-data" (OuterVolumeSpecName: "config-data") pod "8b02ee21-c208-483e-b2e9-8830c54605d7" (UID: "8b02ee21-c208-483e-b2e9-8830c54605d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.630855 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b02ee21-c208-483e-b2e9-8830c54605d7" (UID: "8b02ee21-c208-483e-b2e9-8830c54605d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.664545 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.664578 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77mgt\" (UniqueName: \"kubernetes.io/projected/8b02ee21-c208-483e-b2e9-8830c54605d7-kube-api-access-77mgt\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.664590 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:42 crc kubenswrapper[4672]: I1206 09:25:42.664618 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02ee21-c208-483e-b2e9-8830c54605d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.050017 4672 generic.go:334] "Generic (PLEG): container finished" podID="71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" containerID="a2a360d83bce94c2323b21912f26a423771f5c2386dffa6443b23b111c1fc5f9" exitCode=0 Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.050102 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zlxms" event={"ID":"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335","Type":"ContainerDied","Data":"a2a360d83bce94c2323b21912f26a423771f5c2386dffa6443b23b111c1fc5f9"} Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.051631 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mc84g" event={"ID":"8b02ee21-c208-483e-b2e9-8830c54605d7","Type":"ContainerDied","Data":"88bde2e856eec8316e7f660f327c67f1ced4af2fd2c7b1993414a8435348c747"} Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.051658 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88bde2e856eec8316e7f660f327c67f1ced4af2fd2c7b1993414a8435348c747" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.051696 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mc84g" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.244732 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.245035 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-log" containerID="cri-o://3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1" gracePeriod=30 Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.251276 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-api" containerID="cri-o://0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa" gracePeriod=30 Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.270157 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.270396 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="12457efd-fc0b-4659-83b8-93b93139eb73" containerName="nova-scheduler-scheduler" containerID="cri-o://94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f" gracePeriod=30 Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.288117 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.288817 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-metadata" containerID="cri-o://fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3" gracePeriod=30 Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.289006 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-log" containerID="cri-o://3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335" gracePeriod=30 Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.360544 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.360591 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.896422 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.985893 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swgq8\" (UniqueName: \"kubernetes.io/projected/afae2fde-e366-49ff-867b-a698d71c22c6-kube-api-access-swgq8\") pod \"afae2fde-e366-49ff-867b-a698d71c22c6\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.985969 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-nova-metadata-tls-certs\") pod \"afae2fde-e366-49ff-867b-a698d71c22c6\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.986046 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-combined-ca-bundle\") pod \"afae2fde-e366-49ff-867b-a698d71c22c6\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.986074 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afae2fde-e366-49ff-867b-a698d71c22c6-logs\") pod \"afae2fde-e366-49ff-867b-a698d71c22c6\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.986200 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-config-data\") pod \"afae2fde-e366-49ff-867b-a698d71c22c6\" (UID: \"afae2fde-e366-49ff-867b-a698d71c22c6\") " Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.987481 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afae2fde-e366-49ff-867b-a698d71c22c6-logs" (OuterVolumeSpecName: "logs") pod "afae2fde-e366-49ff-867b-a698d71c22c6" (UID: "afae2fde-e366-49ff-867b-a698d71c22c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:43 crc kubenswrapper[4672]: I1206 09:25:43.996666 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afae2fde-e366-49ff-867b-a698d71c22c6-kube-api-access-swgq8" (OuterVolumeSpecName: "kube-api-access-swgq8") pod "afae2fde-e366-49ff-867b-a698d71c22c6" (UID: "afae2fde-e366-49ff-867b-a698d71c22c6"). InnerVolumeSpecName "kube-api-access-swgq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.043133 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-config-data" (OuterVolumeSpecName: "config-data") pod "afae2fde-e366-49ff-867b-a698d71c22c6" (UID: "afae2fde-e366-49ff-867b-a698d71c22c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.045178 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afae2fde-e366-49ff-867b-a698d71c22c6" (UID: "afae2fde-e366-49ff-867b-a698d71c22c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.056314 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.056517 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" containerName="kube-state-metrics" containerID="cri-o://b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3" gracePeriod=30 Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.072462 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "afae2fde-e366-49ff-867b-a698d71c22c6" (UID: "afae2fde-e366-49ff-867b-a698d71c22c6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.088728 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.088756 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swgq8\" (UniqueName: \"kubernetes.io/projected/afae2fde-e366-49ff-867b-a698d71c22c6-kube-api-access-swgq8\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.088767 4672 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.088776 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2fde-e366-49ff-867b-a698d71c22c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.088784 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afae2fde-e366-49ff-867b-a698d71c22c6-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.093915 4672 generic.go:334] "Generic (PLEG): container finished" podID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerID="3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1" exitCode=143 Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.093992 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74e55b59-cf24-44da-bb4e-045a22aea20b","Type":"ContainerDied","Data":"3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1"} Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.096902 4672 generic.go:334] "Generic (PLEG): container finished" podID="afae2fde-e366-49ff-867b-a698d71c22c6" containerID="fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3" exitCode=0 Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.096922 4672 generic.go:334] "Generic (PLEG): container finished" podID="afae2fde-e366-49ff-867b-a698d71c22c6" containerID="3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335" exitCode=143 Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.097087 4672 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.101805 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afae2fde-e366-49ff-867b-a698d71c22c6","Type":"ContainerDied","Data":"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3"} Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.101852 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afae2fde-e366-49ff-867b-a698d71c22c6","Type":"ContainerDied","Data":"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335"} Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.101862 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afae2fde-e366-49ff-867b-a698d71c22c6","Type":"ContainerDied","Data":"90b58163f542efc84999b9415ceed90e7cd72bd86f9213c01031786897a6f13f"} Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.101881 4672 scope.go:117] "RemoveContainer" containerID="fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.249391 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.259157 4672 scope.go:117] "RemoveContainer" containerID="3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.275179 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.292791 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.293184 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-log" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293197 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-log" Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.293208 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerName="dnsmasq-dns" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293214 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerName="dnsmasq-dns" Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.293251 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b02ee21-c208-483e-b2e9-8830c54605d7" containerName="nova-manage" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293258 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b02ee21-c208-483e-b2e9-8830c54605d7" containerName="nova-manage" Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.293271 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-metadata" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293277 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-metadata" Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.293292 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerName="init" 
Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293299 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerName="init" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293457 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-metadata" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293469 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" containerName="nova-metadata-log" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293476 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="1601ecfd-770a-4d88-9a5e-fb465a0b98f0" containerName="dnsmasq-dns" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.293484 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b02ee21-c208-483e-b2e9-8830c54605d7" containerName="nova-manage" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.294443 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.297544 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.299033 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.300683 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.319367 4672 scope.go:117] "RemoveContainer" containerID="fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3" Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.321736 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3\": container with ID starting with fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3 not found: ID does not exist" containerID="fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.321774 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3"} err="failed to get container status \"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3\": rpc error: code = NotFound desc = could not find container \"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3\": container with ID starting with fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3 not found: ID does not exist" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.321800 4672 scope.go:117] "RemoveContainer" containerID="3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335" Dec 06 09:25:44 crc kubenswrapper[4672]: E1206 09:25:44.323731 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335\": container with ID starting with 3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335 not found: ID does not exist" containerID="3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335" Dec 06 
09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.323750 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335"} err="failed to get container status \"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335\": rpc error: code = NotFound desc = could not find container \"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335\": container with ID starting with 3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335 not found: ID does not exist" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.323764 4672 scope.go:117] "RemoveContainer" containerID="fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.326699 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3"} err="failed to get container status \"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3\": rpc error: code = NotFound desc = could not find container \"fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3\": container with ID starting with fc8e7d4894d84704b2306557cad8eaf3cb355f2238ec887087359ae8c9a692d3 not found: ID does not exist" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.326723 4672 scope.go:117] "RemoveContainer" containerID="3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.329629 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335"} err="failed to get container status \"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335\": rpc error: code = NotFound desc = could not find container \"3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335\": container with ID starting with 3b972b404617d604634bdcc53babc2414ae7efe49676e8a2339c8bf9aec19335 not found: ID does not exist" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.393982 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-config-data\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.394375 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8fc694-01bf-4882-a9f8-07d026a37ee2-logs\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.394408 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.394471 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.394505 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jck7x\" (UniqueName: \"kubernetes.io/projected/8a8fc694-01bf-4882-a9f8-07d026a37ee2-kube-api-access-jck7x\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.495673 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.495719 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jck7x\" (UniqueName: \"kubernetes.io/projected/8a8fc694-01bf-4882-a9f8-07d026a37ee2-kube-api-access-jck7x\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.495753 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-config-data\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.495809 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8fc694-01bf-4882-a9f8-07d026a37ee2-logs\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.495843 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.496939 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8fc694-01bf-4882-a9f8-07d026a37ee2-logs\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.499378 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.499533 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.500700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-config-data\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.526559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jck7x\" (UniqueName: \"kubernetes.io/projected/8a8fc694-01bf-4882-a9f8-07d026a37ee2-kube-api-access-jck7x\") pod \"nova-metadata-0\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") " pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.588159 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afae2fde-e366-49ff-867b-a698d71c22c6" path="/var/lib/kubelet/pods/afae2fde-e366-49ff-867b-a698d71c22c6/volumes" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.621379 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.644153 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.649697 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.830966 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.888767 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vpkb\" (UniqueName: \"kubernetes.io/projected/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-kube-api-access-5vpkb\") pod \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.888825 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bldqw\" (UniqueName: \"kubernetes.io/projected/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62-kube-api-access-bldqw\") pod \"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62\" (UID: \"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.888932 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-config-data\") pod \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.888966 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-combined-ca-bundle\") pod \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.889010 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-scripts\") pod \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\" (UID: \"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.897367 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62-kube-api-access-bldqw" (OuterVolumeSpecName: 
"kube-api-access-bldqw") pod "4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" (UID: "4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62"). InnerVolumeSpecName "kube-api-access-bldqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.898913 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-scripts" (OuterVolumeSpecName: "scripts") pod "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" (UID: "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.899411 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-kube-api-access-5vpkb" (OuterVolumeSpecName: "kube-api-access-5vpkb") pod "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" (UID: "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335"). InnerVolumeSpecName "kube-api-access-5vpkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.916810 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-config-data" (OuterVolumeSpecName: "config-data") pod "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" (UID: "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.939690 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" (UID: "71ef5f2b-0e30-4ee4-b6b3-957ee08ad335"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.992268 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-combined-ca-bundle\") pod \"12457efd-fc0b-4659-83b8-93b93139eb73\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.992358 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-config-data\") pod \"12457efd-fc0b-4659-83b8-93b93139eb73\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.992447 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-888wg\" (UniqueName: \"kubernetes.io/projected/12457efd-fc0b-4659-83b8-93b93139eb73-kube-api-access-888wg\") pod \"12457efd-fc0b-4659-83b8-93b93139eb73\" (UID: \"12457efd-fc0b-4659-83b8-93b93139eb73\") " Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.993894 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vpkb\" (UniqueName: \"kubernetes.io/projected/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-kube-api-access-5vpkb\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.993925 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bldqw\" (UniqueName: \"kubernetes.io/projected/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62-kube-api-access-bldqw\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.993936 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.993946 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.993954 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:44 crc kubenswrapper[4672]: I1206 09:25:44.997748 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12457efd-fc0b-4659-83b8-93b93139eb73-kube-api-access-888wg" (OuterVolumeSpecName: "kube-api-access-888wg") pod "12457efd-fc0b-4659-83b8-93b93139eb73" (UID: "12457efd-fc0b-4659-83b8-93b93139eb73"). InnerVolumeSpecName "kube-api-access-888wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.018601 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12457efd-fc0b-4659-83b8-93b93139eb73" (UID: "12457efd-fc0b-4659-83b8-93b93139eb73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.023193 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-config-data" (OuterVolumeSpecName: "config-data") pod "12457efd-fc0b-4659-83b8-93b93139eb73" (UID: "12457efd-fc0b-4659-83b8-93b93139eb73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.095916 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.095953 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-888wg\" (UniqueName: \"kubernetes.io/projected/12457efd-fc0b-4659-83b8-93b93139eb73-kube-api-access-888wg\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.095965 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457efd-fc0b-4659-83b8-93b93139eb73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.111700 4672 generic.go:334] "Generic (PLEG): container finished" podID="4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" containerID="b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3" exitCode=2 Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.111775 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62","Type":"ContainerDied","Data":"b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3"} Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.111785 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.111801 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62","Type":"ContainerDied","Data":"c1a5266830d1127726b933682c619f5717e57d1112af092a9dec2d6b7102b4b5"} Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.111819 4672 scope.go:117] "RemoveContainer" containerID="b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.113948 4672 generic.go:334] "Generic (PLEG): container finished" podID="12457efd-fc0b-4659-83b8-93b93139eb73" containerID="94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f" exitCode=0 Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.114018 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12457efd-fc0b-4659-83b8-93b93139eb73","Type":"ContainerDied","Data":"94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f"} Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.114035 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12457efd-fc0b-4659-83b8-93b93139eb73","Type":"ContainerDied","Data":"d7111207fa5022fe1b55800a94fff3e37ad9152778927a54a6287d62e1ae0a9b"} Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.114110 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.125979 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zlxms" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.125977 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zlxms" event={"ID":"71ef5f2b-0e30-4ee4-b6b3-957ee08ad335","Type":"ContainerDied","Data":"d90650203eff8a387804df4c568ce1d441c34fdc46e43e33d1ab080fc5d70978"} Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.126097 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90650203eff8a387804df4c568ce1d441c34fdc46e43e33d1ab080fc5d70978" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.142139 4672 scope.go:117] "RemoveContainer" containerID="b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3" Dec 06 09:25:45 crc kubenswrapper[4672]: E1206 09:25:45.142959 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3\": container with ID starting with b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3 not found: ID does not exist" containerID="b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.142992 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3"} err="failed to get container status \"b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3\": rpc error: code = NotFound desc = could not find container \"b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3\": container with ID starting with b50b3aaf277df5d74f98ed52523368d6640aed76ba4a931bdf1412d08d32a4b3 not found: ID does not exist" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.143017 4672 scope.go:117] "RemoveContainer" containerID="94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.178356 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.205687 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.219552 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: E1206 09:25:45.220030 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" containerName="kube-state-metrics" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220043 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" containerName="kube-state-metrics" Dec 06 09:25:45 crc kubenswrapper[4672]: E1206 09:25:45.220056 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" containerName="nova-cell1-conductor-db-sync" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220069 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" containerName="nova-cell1-conductor-db-sync" Dec 06 09:25:45 crc kubenswrapper[4672]: E1206 
09:25:45.220092 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12457efd-fc0b-4659-83b8-93b93139eb73" containerName="nova-scheduler-scheduler" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220099 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="12457efd-fc0b-4659-83b8-93b93139eb73" containerName="nova-scheduler-scheduler" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220273 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="12457efd-fc0b-4659-83b8-93b93139eb73" containerName="nova-scheduler-scheduler" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220284 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" containerName="kube-state-metrics" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220301 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" containerName="nova-cell1-conductor-db-sync" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.220877 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.223733 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.223916 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.245165 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.246344 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.250466 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.259657 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.268188 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.278893 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.304697 4672 scope.go:117] "RemoveContainer" containerID="94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f" Dec 06 09:25:45 crc kubenswrapper[4672]: E1206 09:25:45.308197 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f\": container with ID starting with 94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f not found: ID does not exist" containerID="94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.308350 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f"} err="failed to get container status \"94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f\": rpc error: code = NotFound desc = could not find container \"94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f\": container with ID starting with 94d8d67e6af20eb7c95f5c8d66cf1ae0319fc38c4acd1e020a78ea712245646f not found: ID does not exist" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.390818 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405551 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405628 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1902d9-bb65-4974-a922-056811447603-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405671 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1902d9-bb65-4974-a922-056811447603-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405693 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405711 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52tv\" (UniqueName: \"kubernetes.io/projected/2c1902d9-bb65-4974-a922-056811447603-kube-api-access-z52tv\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405727 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.405756 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsd5z\" (UniqueName: \"kubernetes.io/projected/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-api-access-dsd5z\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.411679 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.465006 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.466035 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.469265 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.481357 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507433 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1902d9-bb65-4974-a922-056811447603-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507497 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507520 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z52tv\" (UniqueName: \"kubernetes.io/projected/2c1902d9-bb65-4974-a922-056811447603-kube-api-access-z52tv\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507539 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507574 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsd5z\" (UniqueName: \"kubernetes.io/projected/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-api-access-dsd5z\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507641 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.507695 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1902d9-bb65-4974-a922-056811447603-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.527463 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.528886 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1902d9-bb65-4974-a922-056811447603-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.538215 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1902d9-bb65-4974-a922-056811447603-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.608820 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-config-data\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.608884 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.609308 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk85f\" (UniqueName: \"kubernetes.io/projected/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-kube-api-access-fk85f\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.610784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.610974 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.611559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsd5z\" (UniqueName: \"kubernetes.io/projected/d6a82c78-9f40-4c1a-8f10-03f92549df7b-kube-api-access-dsd5z\") pod \"kube-state-metrics-0\" (UID: \"d6a82c78-9f40-4c1a-8f10-03f92549df7b\") " pod="openstack/kube-state-metrics-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.611929 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z52tv\" (UniqueName: \"kubernetes.io/projected/2c1902d9-bb65-4974-a922-056811447603-kube-api-access-z52tv\") pod \"nova-cell1-conductor-0\" (UID: \"2c1902d9-bb65-4974-a922-056811447603\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.666100 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.712639 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk85f\" (UniqueName: \"kubernetes.io/projected/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-kube-api-access-fk85f\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.712696 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-config-data\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.712726 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.719698 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-config-data\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.723288 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.738737 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk85f\" (UniqueName: \"kubernetes.io/projected/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-kube-api-access-fk85f\") pod \"nova-scheduler-0\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") " pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.798014 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.902735 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.903003 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-central-agent" containerID="cri-o://a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8" gracePeriod=30 Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.903401 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="proxy-httpd" containerID="cri-o://81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39" gracePeriod=30 Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.903447 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="sg-core" containerID="cri-o://c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013" gracePeriod=30 Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.903480 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-notification-agent" containerID="cri-o://3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c" gracePeriod=30 Dec 06 09:25:45 crc kubenswrapper[4672]: I1206 09:25:45.917193 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.193568 4672 generic.go:334] "Generic (PLEG): container finished" podID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerID="81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39" exitCode=0 Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.193910 4672 generic.go:334] "Generic (PLEG): container finished" podID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerID="c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013" exitCode=2 Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.193961 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerDied","Data":"81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39"} Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.193982 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerDied","Data":"c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013"} Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.196256 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a8fc694-01bf-4882-a9f8-07d026a37ee2","Type":"ContainerStarted","Data":"c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818"} Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.196280 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a8fc694-01bf-4882-a9f8-07d026a37ee2","Type":"ContainerStarted","Data":"1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c"} Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.196288 
4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a8fc694-01bf-4882-a9f8-07d026a37ee2","Type":"ContainerStarted","Data":"1a4341bb5f5009f04bdadc83e7efa94ffe8e8e417e952befa1ae7d47f7297b98"} Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.223781 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.223765627 podStartE2EDuration="2.223765627s" podCreationTimestamp="2025-12-06 09:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:46.218133824 +0000 UTC m=+1163.962394111" watchObservedRunningTime="2025-12-06 09:25:46.223765627 +0000 UTC m=+1163.968025914" Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.255368 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.337224 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.568156 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12457efd-fc0b-4659-83b8-93b93139eb73" path="/var/lib/kubelet/pods/12457efd-fc0b-4659-83b8-93b93139eb73/volumes" Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.569005 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62" path="/var/lib/kubelet/pods/4a2f21bb-694a-4fd5-a5b5-e1d094f8ef62/volumes" Dec 06 09:25:46 crc kubenswrapper[4672]: I1206 09:25:46.624050 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.207498 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c1902d9-bb65-4974-a922-056811447603","Type":"ContainerStarted","Data":"1d02614d24fc2d624c0505b85e2f6633f8907f85bf8b04a838c997172354d544"} Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.207539 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c1902d9-bb65-4974-a922-056811447603","Type":"ContainerStarted","Data":"806a334d03e77fdf588cfff2606098f5ae0369572d998e8a0f835d24f3779452"} Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.208496 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.209706 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d6a82c78-9f40-4c1a-8f10-03f92549df7b","Type":"ContainerStarted","Data":"864db44916d89753ba09bf8409aa12653ffb755c73584c12ba1a194178919108"} Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.243488 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.243470324 podStartE2EDuration="2.243470324s" podCreationTimestamp="2025-12-06 09:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:47.236041873 +0000 UTC m=+1164.980302170" watchObservedRunningTime="2025-12-06 09:25:47.243470324 +0000 UTC m=+1164.987730611" Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.247008 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2","Type":"ContainerStarted","Data":"e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187"} Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.247063 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2","Type":"ContainerStarted","Data":"b33c16d8ffbc119ad65281554f37c7c6e784d249ca3a5b686312a98b5fc0dad6"} Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.269446 4672 generic.go:334] "Generic (PLEG): container finished" podID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerID="a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8" exitCode=0 Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.269574 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.269556689 podStartE2EDuration="2.269556689s" podCreationTimestamp="2025-12-06 09:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:47.264057701 +0000 UTC m=+1165.008317988" watchObservedRunningTime="2025-12-06 09:25:47.269556689 +0000 UTC m=+1165.013816976" Dec 06 09:25:47 crc kubenswrapper[4672]: I1206 09:25:47.269724 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerDied","Data":"a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8"} Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.247069 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.278730 4672 generic.go:334] "Generic (PLEG): container finished" podID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerID="0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa" exitCode=0 Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.278783 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.278799 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74e55b59-cf24-44da-bb4e-045a22aea20b","Type":"ContainerDied","Data":"0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa"} Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.278827 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74e55b59-cf24-44da-bb4e-045a22aea20b","Type":"ContainerDied","Data":"b303be087ae9c0074a0d829f68249b45039e48e66df8598d5d3a4e0be3f54fa6"} Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.278844 4672 scope.go:117] "RemoveContainer" containerID="0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.280854 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d6a82c78-9f40-4c1a-8f10-03f92549df7b","Type":"ContainerStarted","Data":"f455675d6d0ec68c386e46ec4d0208b5e0c05b34c2e6eb0fde23887e08b99168"} Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.281000 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.311448 4672 scope.go:117] "RemoveContainer" containerID="3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.330350 4672 scope.go:117] "RemoveContainer" containerID="0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa" Dec 06 09:25:48 crc kubenswrapper[4672]: E1206 09:25:48.330905 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa\": container with ID starting with 0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa not found: ID does not exist" containerID="0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.330940 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa"} err="failed to get container status \"0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa\": rpc error: code = NotFound desc = could not find container \"0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa\": container with ID starting with 0a6e27df90c21d052f5bb945157bf6f17d4b0be2dce8cf6983488dca656598fa not found: ID does not exist" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.330961 4672 scope.go:117] "RemoveContainer" containerID="3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1" Dec 06 09:25:48 crc kubenswrapper[4672]: E1206 09:25:48.334803 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1\": container with ID starting with 3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1 not found: ID does not exist" containerID="3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.334847 4672 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1"} err="failed to get container status \"3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1\": rpc error: code = NotFound desc = could not find container \"3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1\": container with ID starting with 3aec3a791963140db6c2b5f3f35bd12a96010239a5066c2c262e2de161b090a1 not found: ID does not exist" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.414303 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-config-data\") pod \"74e55b59-cf24-44da-bb4e-045a22aea20b\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.414382 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-combined-ca-bundle\") pod \"74e55b59-cf24-44da-bb4e-045a22aea20b\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.414484 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcqxz\" (UniqueName: \"kubernetes.io/projected/74e55b59-cf24-44da-bb4e-045a22aea20b-kube-api-access-mcqxz\") pod \"74e55b59-cf24-44da-bb4e-045a22aea20b\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.414508 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e55b59-cf24-44da-bb4e-045a22aea20b-logs\") pod \"74e55b59-cf24-44da-bb4e-045a22aea20b\" (UID: \"74e55b59-cf24-44da-bb4e-045a22aea20b\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.414940 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e55b59-cf24-44da-bb4e-045a22aea20b-logs" (OuterVolumeSpecName: "logs") pod "74e55b59-cf24-44da-bb4e-045a22aea20b" (UID: "74e55b59-cf24-44da-bb4e-045a22aea20b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.429797 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e55b59-cf24-44da-bb4e-045a22aea20b-kube-api-access-mcqxz" (OuterVolumeSpecName: "kube-api-access-mcqxz") pod "74e55b59-cf24-44da-bb4e-045a22aea20b" (UID: "74e55b59-cf24-44da-bb4e-045a22aea20b"). InnerVolumeSpecName "kube-api-access-mcqxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.450796 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-config-data" (OuterVolumeSpecName: "config-data") pod "74e55b59-cf24-44da-bb4e-045a22aea20b" (UID: "74e55b59-cf24-44da-bb4e-045a22aea20b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.466686 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e55b59-cf24-44da-bb4e-045a22aea20b" (UID: "74e55b59-cf24-44da-bb4e-045a22aea20b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.521846 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.522094 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e55b59-cf24-44da-bb4e-045a22aea20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.522155 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcqxz\" (UniqueName: \"kubernetes.io/projected/74e55b59-cf24-44da-bb4e-045a22aea20b-kube-api-access-mcqxz\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.522208 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e55b59-cf24-44da-bb4e-045a22aea20b-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.609865 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.150185769 podStartE2EDuration="3.609849185s" podCreationTimestamp="2025-12-06 09:25:45 +0000 UTC" firstStartedPulling="2025-12-06 09:25:46.629114916 +0000 UTC m=+1164.373375203" lastFinishedPulling="2025-12-06 09:25:47.088778332 +0000 UTC m=+1164.833038619" observedRunningTime="2025-12-06 09:25:48.311722484 +0000 UTC m=+1166.055982771" watchObservedRunningTime="2025-12-06 09:25:48.609849185 +0000 UTC m=+1166.354109472" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.617265 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.626938 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.658369 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:48 crc kubenswrapper[4672]: E1206 09:25:48.658745 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-log" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.658762 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-log" Dec 06 09:25:48 crc kubenswrapper[4672]: E1206 09:25:48.658788 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-api" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.658797 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-api" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.658950 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-log" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.658964 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" containerName="nova-api-api" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.659850 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.663500 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.683796 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.826681 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.828034 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.828097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fqsl\" (UniqueName: \"kubernetes.io/projected/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-kube-api-access-5fqsl\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.828207 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-logs\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.828257 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-config-data\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930114 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-log-httpd\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930178 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-run-httpd\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930212 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-scripts\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930253 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqqmb\" (UniqueName: \"kubernetes.io/projected/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-kube-api-access-sqqmb\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930293 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-config-data\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-sg-core-conf-yaml\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930385 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-combined-ca-bundle\") pod \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\" (UID: \"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e\") " Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930648 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930665 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930709 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fqsl\" (UniqueName: \"kubernetes.io/projected/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-kube-api-access-5fqsl\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930775 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-logs\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930815 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-config-data\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.930865 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.931191 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.931704 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-logs\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.948358 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.948385 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-kube-api-access-sqqmb" (OuterVolumeSpecName: "kube-api-access-sqqmb") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "kube-api-access-sqqmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.948433 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-config-data\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.949484 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-scripts" (OuterVolumeSpecName: "scripts") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.951230 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fqsl\" (UniqueName: \"kubernetes.io/projected/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-kube-api-access-5fqsl\") pod \"nova-api-0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " pod="openstack/nova-api-0" Dec 06 09:25:48 crc kubenswrapper[4672]: I1206 09:25:48.957228 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.004470 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.028757 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-config-data" (OuterVolumeSpecName: "config-data") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.028835 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" (UID: "d66af06d-1cf9-4a5e-9649-d22bc9f00b7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.032647 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.032679 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.032689 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqqmb\" (UniqueName: \"kubernetes.io/projected/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-kube-api-access-sqqmb\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.032700 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.032707 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.032715 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.307566 4672 generic.go:334] "Generic (PLEG): container finished" podID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerID="3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c" exitCode=0 Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.307726 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerDied","Data":"3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c"} Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.307765 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66af06d-1cf9-4a5e-9649-d22bc9f00b7e","Type":"ContainerDied","Data":"77fd6e4591331c63f43553a4af2f7b37ddf450e2de1ec00394ff05ca17f9956e"} Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.307770 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.307781 4672 scope.go:117] "RemoveContainer" containerID="81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.382792 4672 scope.go:117] "RemoveContainer" containerID="c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.392669 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.409694 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440177 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.440558 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="proxy-httpd" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440577 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="proxy-httpd" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.440619 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-central-agent" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440626 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-central-agent" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.440645 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-notification-agent" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440651 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-notification-agent" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.440669 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="sg-core" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440675 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="sg-core" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440958 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="sg-core" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440981 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-central-agent" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.440990 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="ceilometer-notification-agent" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.441001 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" containerName="proxy-httpd" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.448466 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.455400 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.455724 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.455929 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.471949 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.494026 4672 scope.go:117] "RemoveContainer" containerID="3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.513257 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.558979 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-log-httpd\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559018 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559057 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-run-httpd\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559088 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559117 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-scripts\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559163 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559204 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5x4w\" (UniqueName: 
\"kubernetes.io/projected/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-kube-api-access-t5x4w\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.559233 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-config-data\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.574017 4672 scope.go:117] "RemoveContainer" containerID="a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.604808 4672 scope.go:117] "RemoveContainer" containerID="81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.605230 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39\": container with ID starting with 81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39 not found: ID does not exist" containerID="81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.605275 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39"} err="failed to get container status \"81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39\": rpc error: code = NotFound desc = could not find container \"81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39\": container with ID starting with 81db1ec6b296b40fb45c7b143629efeefd6d83d70f5cbf0d01eb38a053b72a39 not found: ID does not exist" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.605299 4672 scope.go:117] "RemoveContainer" containerID="c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.610200 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013\": container with ID starting with c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013 not found: ID does not exist" containerID="c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.610255 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013"} err="failed to get container status \"c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013\": rpc error: code = NotFound desc = could not find container \"c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013\": container with ID starting with c9c8917cf28f0cce22394943f1e4a80cf81551b68bfd2601840e353214977013 not found: ID does not exist" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.610273 4672 scope.go:117] "RemoveContainer" containerID="3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.610574 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c\": container with ID starting with 3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c not found: ID does not exist" containerID="3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.610627 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c"} err="failed to get container status \"3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c\": rpc error: code = NotFound desc = could not find container \"3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c\": container with ID starting with 3c550e70dde59ae2a84a9b8264c0065e536239abcea3d693949a1907f33ef09c not found: ID does not exist" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.610640 4672 scope.go:117] "RemoveContainer" containerID="a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8" Dec 06 09:25:49 crc kubenswrapper[4672]: E1206 09:25:49.610968 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8\": container with ID starting with a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8 not found: ID does not exist" containerID="a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.611008 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8"} err="failed to get container status \"a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8\": rpc error: code = NotFound desc = could not find container \"a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8\": container with ID starting with a81773c53e6e5882eee59f1c96103188127c38505a0c88fc98f710a176858be8 not found: ID does not exist" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.622036 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.622920 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.660880 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.660974 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5x4w\" (UniqueName: \"kubernetes.io/projected/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-kube-api-access-t5x4w\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.661037 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-config-data\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 
crc kubenswrapper[4672]: I1206 09:25:49.661473 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-log-httpd\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.661505 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.661555 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-run-httpd\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.661588 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.661643 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-scripts\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.662087 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-run-httpd\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.662337 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-log-httpd\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.669705 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.673638 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.674068 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-scripts\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.688982 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-config-data\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.693416 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5x4w\" (UniqueName: \"kubernetes.io/projected/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-kube-api-access-t5x4w\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.693642 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " pod="openstack/ceilometer-0" Dec 06 09:25:49 crc kubenswrapper[4672]: I1206 09:25:49.796019 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.312186 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.330234 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerStarted","Data":"0ecd71fb211d3b646a6f5a6b9485425c51f04ad8553df33f23fbf4f19f1c8490"} Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.335728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a34da5c4-fd91-4b56-a534-9ee9350bb7d0","Type":"ContainerStarted","Data":"80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e"} Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.335781 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a34da5c4-fd91-4b56-a534-9ee9350bb7d0","Type":"ContainerStarted","Data":"f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98"} Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.335794 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a34da5c4-fd91-4b56-a534-9ee9350bb7d0","Type":"ContainerStarted","Data":"00d960578b0d7598ae45e5ea5f66c025e4c77fb3ee0dbfe0a34b85cbf80b3100"} Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.358621 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.3586057719999998 podStartE2EDuration="2.358605772s" podCreationTimestamp="2025-12-06 09:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:25:50.356984039 +0000 UTC m=+1168.101244326" watchObservedRunningTime="2025-12-06 09:25:50.358605772 +0000 UTC m=+1168.102866059" Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.565825 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e55b59-cf24-44da-bb4e-045a22aea20b" path="/var/lib/kubelet/pods/74e55b59-cf24-44da-bb4e-045a22aea20b/volumes" Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.566424 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66af06d-1cf9-4a5e-9649-d22bc9f00b7e" path="/var/lib/kubelet/pods/d66af06d-1cf9-4a5e-9649-d22bc9f00b7e/volumes" 
Dec 06 09:25:50 crc kubenswrapper[4672]: I1206 09:25:50.799417 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 09:25:51 crc kubenswrapper[4672]: I1206 09:25:51.348088 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerStarted","Data":"e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5"} Dec 06 09:25:52 crc kubenswrapper[4672]: I1206 09:25:52.361008 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerStarted","Data":"9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a"} Dec 06 09:25:52 crc kubenswrapper[4672]: I1206 09:25:52.362560 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerStarted","Data":"de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2"} Dec 06 09:25:54 crc kubenswrapper[4672]: I1206 09:25:54.388612 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerStarted","Data":"797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6"} Dec 06 09:25:54 crc kubenswrapper[4672]: I1206 09:25:54.390394 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 09:25:54 crc kubenswrapper[4672]: I1206 09:25:54.423245 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.422455251 podStartE2EDuration="5.423223711s" podCreationTimestamp="2025-12-06 09:25:49 +0000 UTC" firstStartedPulling="2025-12-06 09:25:50.313457352 +0000 UTC m=+1168.057717639" lastFinishedPulling="2025-12-06 09:25:53.314225802 +0000 UTC m=+1171.058486099" observedRunningTime="2025-12-06 09:25:54.412973887 +0000 UTC m=+1172.157234204" watchObservedRunningTime="2025-12-06 09:25:54.423223711 +0000 UTC m=+1172.167484008" Dec 06 09:25:54 crc kubenswrapper[4672]: I1206 09:25:54.621857 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:25:54 crc kubenswrapper[4672]: I1206 09:25:54.622474 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:25:55 crc kubenswrapper[4672]: I1206 09:25:55.636820 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.174:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:25:55 crc kubenswrapper[4672]: I1206 09:25:55.636848 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.174:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:25:55 crc kubenswrapper[4672]: I1206 09:25:55.694253 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 09:25:55 crc kubenswrapper[4672]: I1206 09:25:55.799028 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Dec 06 09:25:55 crc kubenswrapper[4672]: I1206 09:25:55.834256 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 09:25:55 crc kubenswrapper[4672]: I1206 09:25:55.935489 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 09:25:56 crc kubenswrapper[4672]: I1206 09:25:56.446361 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 09:25:59 crc kubenswrapper[4672]: I1206 09:25:59.005770 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:25:59 crc kubenswrapper[4672]: I1206 09:25:59.006133 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:26:00 crc kubenswrapper[4672]: I1206 09:26:00.095095 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:26:00 crc kubenswrapper[4672]: I1206 09:26:00.095515 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:26:04 crc kubenswrapper[4672]: I1206 09:26:04.630680 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:26:04 crc kubenswrapper[4672]: I1206 09:26:04.633751 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:26:04 crc kubenswrapper[4672]: I1206 09:26:04.639551 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:26:05 crc kubenswrapper[4672]: I1206 09:26:05.497104 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.399031 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.497788 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-combined-ca-bundle\") pod \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.498507 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-config-data\") pod \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.498734 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls82g\" (UniqueName: \"kubernetes.io/projected/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-kube-api-access-ls82g\") pod \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\" (UID: \"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5\") " Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.503363 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-kube-api-access-ls82g" (OuterVolumeSpecName: "kube-api-access-ls82g") pod "a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" (UID: "a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5"). InnerVolumeSpecName "kube-api-access-ls82g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.505509 4672 generic.go:334] "Generic (PLEG): container finished" podID="a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" containerID="aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283" exitCode=137 Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.506572 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.507063 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5","Type":"ContainerDied","Data":"aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283"} Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.507097 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5","Type":"ContainerDied","Data":"32ae054a31228a97568753a627c9b670f9be87f8cec95f8643e29c53f21a7bfc"} Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.507113 4672 scope.go:117] "RemoveContainer" containerID="aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.521802 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" (UID: "a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.537284 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-config-data" (OuterVolumeSpecName: "config-data") pod "a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" (UID: "a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.599845 4672 scope.go:117] "RemoveContainer" containerID="aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.600989 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.601028 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.601042 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls82g\" (UniqueName: \"kubernetes.io/projected/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5-kube-api-access-ls82g\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:07 crc kubenswrapper[4672]: E1206 09:26:07.601299 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283\": container with ID starting with aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283 not found: ID does not exist" containerID="aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.601333 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283"} err="failed to get container status \"aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283\": rpc error: code = NotFound desc = could not find container \"aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283\": container with ID starting with aa15165f386808bb3f9ec3238e74576530d764b0877ac2673a8957cc8fc11283 not found: ID does not exist" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.843525 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.851880 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.874270 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:26:07 crc kubenswrapper[4672]: E1206 09:26:07.874732 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.874753 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.875011 4672 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.875807 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.877823 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.877988 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.879062 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.902303 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.906025 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.906117 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bfnx\" (UniqueName: \"kubernetes.io/projected/21ff730f-c3e2-4cf0-8e52-8345907156f1-kube-api-access-2bfnx\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.906163 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.906212 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:07 crc kubenswrapper[4672]: I1206 09:26:07.906272 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.007963 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.008029 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.008082 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.008147 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.008184 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bfnx\" (UniqueName: \"kubernetes.io/projected/21ff730f-c3e2-4cf0-8e52-8345907156f1-kube-api-access-2bfnx\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.011369 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.011742 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.012289 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.015156 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ff730f-c3e2-4cf0-8e52-8345907156f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.025578 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bfnx\" (UniqueName: \"kubernetes.io/projected/21ff730f-c3e2-4cf0-8e52-8345907156f1-kube-api-access-2bfnx\") pod \"nova-cell1-novncproxy-0\" (UID: \"21ff730f-c3e2-4cf0-8e52-8345907156f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.194569 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.585568 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5" path="/var/lib/kubelet/pods/a5a0e3ef-3a22-4cef-9a6e-1c24d582e7c5/volumes" Dec 06 09:26:08 crc kubenswrapper[4672]: W1206 09:26:08.684170 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ff730f_c3e2_4cf0_8e52_8345907156f1.slice/crio-abe0d0ed6646ae7635b5d976480f149f33d22076eea127fba2a6f9353c476493 WatchSource:0}: Error finding container abe0d0ed6646ae7635b5d976480f149f33d22076eea127fba2a6f9353c476493: Status 404 returned error can't find the container with id abe0d0ed6646ae7635b5d976480f149f33d22076eea127fba2a6f9353c476493 Dec 06 09:26:08 crc kubenswrapper[4672]: I1206 09:26:08.685758 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.008859 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.011340 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.022192 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.029825 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.535232 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21ff730f-c3e2-4cf0-8e52-8345907156f1","Type":"ContainerStarted","Data":"e8622502053b360c58a87c4e5f473b9c2608ac9a186c4591be8d72d0fb974761"} Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.535285 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21ff730f-c3e2-4cf0-8e52-8345907156f1","Type":"ContainerStarted","Data":"abe0d0ed6646ae7635b5d976480f149f33d22076eea127fba2a6f9353c476493"} Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.535736 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.541117 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.557145 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.557129834 podStartE2EDuration="2.557129834s" podCreationTimestamp="2025-12-06 09:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:09.554423064 +0000 UTC m=+1187.298683351" watchObservedRunningTime="2025-12-06 09:26:09.557129834 +0000 UTC m=+1187.301390121" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.760648 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c565c95f-vfw8s"] Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.763228 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.778130 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c565c95f-vfw8s"] Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.843919 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-nb\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.844191 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-sb\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.844304 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjv2w\" (UniqueName: \"kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.844449 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-dns-svc\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.844669 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-config\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.946426 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-nb\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.946468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-sb\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.946493 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjv2w\" (UniqueName: \"kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.946518 4672 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-dns-svc\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.946589 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-config\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.947383 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-config\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.947398 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-nb\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.947504 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-sb\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.947739 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-dns-svc\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:09 crc kubenswrapper[4672]: I1206 09:26:09.966517 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjv2w\" (UniqueName: \"kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w\") pod \"dnsmasq-dns-8c565c95f-vfw8s\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:10 crc kubenswrapper[4672]: I1206 09:26:10.084857 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:10 crc kubenswrapper[4672]: I1206 09:26:10.637595 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c565c95f-vfw8s"] Dec 06 09:26:11 crc kubenswrapper[4672]: I1206 09:26:11.586451 4672 generic.go:334] "Generic (PLEG): container finished" podID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerID="9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759" exitCode=0 Dec 06 09:26:11 crc kubenswrapper[4672]: I1206 09:26:11.586541 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" event={"ID":"0abba4d9-6af7-4aaf-894e-b442873ec67f","Type":"ContainerDied","Data":"9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759"} Dec 06 09:26:11 crc kubenswrapper[4672]: I1206 09:26:11.586955 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" event={"ID":"0abba4d9-6af7-4aaf-894e-b442873ec67f","Type":"ContainerStarted","Data":"8b2f14d2dfbbcb8e6c7496b8f876f2efc7afa3a9d1d42b23801e64e719275e07"} Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.319417 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.319953 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.350498 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.597534 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-log" containerID="cri-o://f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98" gracePeriod=30 Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.598345 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" event={"ID":"0abba4d9-6af7-4aaf-894e-b442873ec67f","Type":"ContainerStarted","Data":"8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a"} Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.598371 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.598614 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-api" containerID="cri-o://80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e" gracePeriod=30 Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.626565 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" podStartSLOduration=3.62654727 podStartE2EDuration="3.62654727s" podCreationTimestamp="2025-12-06 09:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:12.616990365 +0000 UTC m=+1190.361250652" watchObservedRunningTime="2025-12-06 09:26:12.62654727 +0000 UTC m=+1190.370807557" Dec 06 09:26:12 crc kubenswrapper[4672]: E1206 09:26:12.680249 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda34da5c4_fd91_4b56_a534_9ee9350bb7d0.slice/crio-conmon-f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.703257 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.703785 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-central-agent" containerID="cri-o://e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5" gracePeriod=30 Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.704236 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="proxy-httpd" containerID="cri-o://797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6" gracePeriod=30 Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.704425 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="sg-core" containerID="cri-o://9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a" gracePeriod=30 Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.704542 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-notification-agent" containerID="cri-o://de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2" gracePeriod=30 Dec 06 09:26:12 crc kubenswrapper[4672]: I1206 09:26:12.711017 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.179:3000/\": read tcp 10.217.0.2:40522->10.217.0.179:3000: read: connection reset by peer" Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.194936 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.607038 4672 generic.go:334] "Generic (PLEG): container finished" podID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerID="797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6" exitCode=0 Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.607064 4672 generic.go:334] "Generic (PLEG): container finished" podID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerID="9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a" exitCode=2 Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.607071 4672 generic.go:334] "Generic (PLEG): container finished" podID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerID="e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5" exitCode=0 Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.607113 
4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerDied","Data":"797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6"} Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.607145 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerDied","Data":"9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a"} Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.607157 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerDied","Data":"e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5"} Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.608923 4672 generic.go:334] "Generic (PLEG): container finished" podID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerID="f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98" exitCode=143 Dec 06 09:26:13 crc kubenswrapper[4672]: I1206 09:26:13.608987 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a34da5c4-fd91-4b56-a534-9ee9350bb7d0","Type":"ContainerDied","Data":"f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98"} Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.220419 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.383043 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-combined-ca-bundle\") pod \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.383293 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-logs\") pod \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.383379 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fqsl\" (UniqueName: \"kubernetes.io/projected/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-kube-api-access-5fqsl\") pod \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.383443 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-config-data\") pod \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\" (UID: \"a34da5c4-fd91-4b56-a534-9ee9350bb7d0\") " Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.384131 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-logs" (OuterVolumeSpecName: "logs") pod "a34da5c4-fd91-4b56-a534-9ee9350bb7d0" (UID: "a34da5c4-fd91-4b56-a534-9ee9350bb7d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.390747 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-kube-api-access-5fqsl" (OuterVolumeSpecName: "kube-api-access-5fqsl") pod "a34da5c4-fd91-4b56-a534-9ee9350bb7d0" (UID: "a34da5c4-fd91-4b56-a534-9ee9350bb7d0"). InnerVolumeSpecName "kube-api-access-5fqsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.429495 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-config-data" (OuterVolumeSpecName: "config-data") pod "a34da5c4-fd91-4b56-a534-9ee9350bb7d0" (UID: "a34da5c4-fd91-4b56-a534-9ee9350bb7d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.435361 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a34da5c4-fd91-4b56-a534-9ee9350bb7d0" (UID: "a34da5c4-fd91-4b56-a534-9ee9350bb7d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.487178 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.487221 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.487236 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.487249 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fqsl\" (UniqueName: \"kubernetes.io/projected/a34da5c4-fd91-4b56-a534-9ee9350bb7d0-kube-api-access-5fqsl\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.645356 4672 generic.go:334] "Generic (PLEG): container finished" podID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerID="80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e" exitCode=0 Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.645574 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a34da5c4-fd91-4b56-a534-9ee9350bb7d0","Type":"ContainerDied","Data":"80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e"} Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.646056 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a34da5c4-fd91-4b56-a534-9ee9350bb7d0","Type":"ContainerDied","Data":"00d960578b0d7598ae45e5ea5f66c025e4c77fb3ee0dbfe0a34b85cbf80b3100"} Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.646152 4672 scope.go:117] "RemoveContainer" containerID="80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.645776 4672 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.667841 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.675655 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.677114 4672 scope.go:117] "RemoveContainer" containerID="f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.694590 4672 scope.go:117] "RemoveContainer" containerID="80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e" Dec 06 09:26:16 crc kubenswrapper[4672]: E1206 09:26:16.695410 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e\": container with ID starting with 80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e not found: ID does not exist" containerID="80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.695439 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e"} err="failed to get container status \"80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e\": rpc error: code = NotFound desc = could not find container \"80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e\": container with ID starting with 80e425f38c39ae44a432e681991e30d937835dc0916e12cb1be05682e502a85e not found: ID does not exist" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.695462 4672 scope.go:117] "RemoveContainer" containerID="f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.695821 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:16 crc kubenswrapper[4672]: E1206 09:26:16.695873 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98\": container with ID starting with f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98 not found: ID does not exist" containerID="f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.696017 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98"} err="failed to get container status \"f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98\": rpc error: code = NotFound desc = could not find container \"f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98\": container with ID starting with f9a9b5ab3e9798522998d9f8a4b26e61cf394499b857291cffd146b24425be98 not found: ID does not exist" Dec 06 09:26:16 crc kubenswrapper[4672]: E1206 09:26:16.696327 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-log" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.696396 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" 
containerName="nova-api-log" Dec 06 09:26:16 crc kubenswrapper[4672]: E1206 09:26:16.696463 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-api" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.696520 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-api" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.696780 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-api" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.697044 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" containerName="nova-api-log" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.697998 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.703872 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.704126 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.704257 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.719482 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.803161 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.803327 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69c392a-5d64-424c-855d-b4321548387c-logs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.803372 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-config-data\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.803399 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7df87\" (UniqueName: \"kubernetes.io/projected/d69c392a-5d64-424c-855d-b4321548387c-kube-api-access-7df87\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.803462 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-public-tls-certs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.803543 4672 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.905640 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.905699 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.905768 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69c392a-5d64-424c-855d-b4321548387c-logs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.905806 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-config-data\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.905832 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7df87\" (UniqueName: \"kubernetes.io/projected/d69c392a-5d64-424c-855d-b4321548387c-kube-api-access-7df87\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.905897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-public-tls-certs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.906680 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69c392a-5d64-424c-855d-b4321548387c-logs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.911506 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-public-tls-certs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.911555 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.911593 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-config-data\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.914486 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:16 crc kubenswrapper[4672]: I1206 09:26:16.934718 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7df87\" (UniqueName: \"kubernetes.io/projected/d69c392a-5d64-424c-855d-b4321548387c-kube-api-access-7df87\") pod \"nova-api-0\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") " pod="openstack/nova-api-0" Dec 06 09:26:17 crc kubenswrapper[4672]: I1206 09:26:17.027731 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:26:17 crc kubenswrapper[4672]: I1206 09:26:17.500048 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:17 crc kubenswrapper[4672]: I1206 09:26:17.654094 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d69c392a-5d64-424c-855d-b4321548387c","Type":"ContainerStarted","Data":"007c69727b750d18d21e4d422a07a3ec70245beb43b036d6d4d8cdee0e040a5e"} Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.179217 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.194992 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.260346 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.324889 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-ceilometer-tls-certs\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.324945 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-run-httpd\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.324973 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-config-data\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.325010 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-log-httpd\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.325076 4672 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-sg-core-conf-yaml\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.325105 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-combined-ca-bundle\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.325165 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-scripts\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.325226 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5x4w\" (UniqueName: \"kubernetes.io/projected/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-kube-api-access-t5x4w\") pod \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\" (UID: \"9cf51826-9cd4-4e4a-9cc5-316ddef360a3\") " Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.325530 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.326879 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.327961 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.330274 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-kube-api-access-t5x4w" (OuterVolumeSpecName: "kube-api-access-t5x4w") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "kube-api-access-t5x4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.330477 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-scripts" (OuterVolumeSpecName: "scripts") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.349586 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.404997 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.408771 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.428638 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5x4w\" (UniqueName: \"kubernetes.io/projected/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-kube-api-access-t5x4w\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.428772 4672 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.428829 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.428906 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.428966 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.429313 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.435188 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-config-data" (OuterVolumeSpecName: "config-data") pod "9cf51826-9cd4-4e4a-9cc5-316ddef360a3" (UID: "9cf51826-9cd4-4e4a-9cc5-316ddef360a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.531196 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf51826-9cd4-4e4a-9cc5-316ddef360a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.573186 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34da5c4-fd91-4b56-a534-9ee9350bb7d0" path="/var/lib/kubelet/pods/a34da5c4-fd91-4b56-a534-9ee9350bb7d0/volumes" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.669002 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d69c392a-5d64-424c-855d-b4321548387c","Type":"ContainerStarted","Data":"6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f"} Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.669041 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d69c392a-5d64-424c-855d-b4321548387c","Type":"ContainerStarted","Data":"57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466"} Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.673071 4672 generic.go:334] "Generic (PLEG): container finished" podID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerID="de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2" exitCode=0 Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.674965 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.675988 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerDied","Data":"de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2"} Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.676055 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf51826-9cd4-4e4a-9cc5-316ddef360a3","Type":"ContainerDied","Data":"0ecd71fb211d3b646a6f5a6b9485425c51f04ad8553df33f23fbf4f19f1c8490"} Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.676089 4672 scope.go:117] "RemoveContainer" containerID="797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.698449 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.698429292 podStartE2EDuration="2.698429292s" podCreationTimestamp="2025-12-06 09:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:18.690450118 +0000 UTC m=+1196.434710405" watchObservedRunningTime="2025-12-06 09:26:18.698429292 +0000 UTC m=+1196.442689589" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.714863 4672 scope.go:117] "RemoveContainer" containerID="9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.715008 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.730235 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.742918 4672 scope.go:117] "RemoveContainer" 
containerID="de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.750449 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.786635 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.787054 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="sg-core" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.787067 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="sg-core" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.787085 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-central-agent" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.787091 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-central-agent" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.787103 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-notification-agent" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.787110 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-notification-agent" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.787121 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="proxy-httpd" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.787126 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="proxy-httpd" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.817582 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-central-agent" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.817648 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="ceilometer-notification-agent" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.817672 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="proxy-httpd" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.817688 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" containerName="sg-core" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.819939 4672 scope.go:117] "RemoveContainer" containerID="e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.829871 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.839740 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.840075 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.842146 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.850990 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.856978 4672 scope.go:117] "RemoveContainer" containerID="797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864063 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-scripts\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864220 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-log-httpd\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864296 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkpd8\" (UniqueName: \"kubernetes.io/projected/4ab86bc2-089c-46d4-9c2c-a05140110779-kube-api-access-zkpd8\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864327 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864376 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-run-httpd\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864396 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864420 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-config-data\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.864541 
4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.869109 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6\": container with ID starting with 797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6 not found: ID does not exist" containerID="797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.869160 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6"} err="failed to get container status \"797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6\": rpc error: code = NotFound desc = could not find container \"797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6\": container with ID starting with 797d71a1bca14ee691d9f38683504ad0d42b08dd101f39ff6b32e5418468ceb6 not found: ID does not exist" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.869188 4672 scope.go:117] "RemoveContainer" containerID="9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.869882 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a\": container with ID starting with 9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a not found: ID does not exist" containerID="9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.869935 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a"} err="failed to get container status \"9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a\": rpc error: code = NotFound desc = could not find container \"9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a\": container with ID starting with 9b9d3569045b442df2f659d3795653d6ec784a0f4f89e5759cdb93380f81e29a not found: ID does not exist" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.869982 4672 scope.go:117] "RemoveContainer" containerID="de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.870438 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2\": container with ID starting with de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2 not found: ID does not exist" containerID="de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.870478 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2"} err="failed to get container status 
\"de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2\": rpc error: code = NotFound desc = could not find container \"de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2\": container with ID starting with de5e4bd0a42ccb46675587314062f7381e8ee5e2872add27c039be6e99e309a2 not found: ID does not exist" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.870534 4672 scope.go:117] "RemoveContainer" containerID="e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5" Dec 06 09:26:18 crc kubenswrapper[4672]: E1206 09:26:18.870955 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5\": container with ID starting with e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5 not found: ID does not exist" containerID="e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.871002 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5"} err="failed to get container status \"e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5\": rpc error: code = NotFound desc = could not find container \"e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5\": container with ID starting with e46341045ab043c958b8069fef460fb38a81aaa5bbb7edd6520c1ea8b2d567c5 not found: ID does not exist" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979169 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-log-httpd\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979537 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkpd8\" (UniqueName: \"kubernetes.io/projected/4ab86bc2-089c-46d4-9c2c-a05140110779-kube-api-access-zkpd8\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979561 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979585 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-run-httpd\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979619 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979634 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-config-data\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979680 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.979735 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-scripts\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.980778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-log-httpd\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.983313 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-scripts\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.984328 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-run-httpd\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.984435 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.985307 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.986740 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:18 crc kubenswrapper[4672]: I1206 09:26:18.990208 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-config-data\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.000815 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mtfvz"] Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.008361 4672 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.012205 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.013385 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mtfvz"] Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.015046 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.057489 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkpd8\" (UniqueName: \"kubernetes.io/projected/4ab86bc2-089c-46d4-9c2c-a05140110779-kube-api-access-zkpd8\") pod \"ceilometer-0\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") " pod="openstack/ceilometer-0" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.081823 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-scripts\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.081862 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.081890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-config-data\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.081954 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmzcf\" (UniqueName: \"kubernetes.io/projected/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-kube-api-access-jmzcf\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.163913 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.183873 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-scripts\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.184407 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.184457 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-config-data\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.184619 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmzcf\" (UniqueName: \"kubernetes.io/projected/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-kube-api-access-jmzcf\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.191393 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-scripts\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.192068 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.193825 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-config-data\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.202406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmzcf\" (UniqueName: \"kubernetes.io/projected/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-kube-api-access-jmzcf\") pod \"nova-cell1-cell-mapping-mtfvz\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") " pod="openstack/nova-cell1-cell-mapping-mtfvz" Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.412156 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mtfvz"
Dec 06 09:26:19 crc kubenswrapper[4672]: W1206 09:26:19.632887 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab86bc2_089c_46d4_9c2c_a05140110779.slice/crio-a75c7ebcf73ef883ff1c97ca59b881d6695c71f9987d9938ca5feab3ba378c5b WatchSource:0}: Error finding container a75c7ebcf73ef883ff1c97ca59b881d6695c71f9987d9938ca5feab3ba378c5b: Status 404 returned error can't find the container with id a75c7ebcf73ef883ff1c97ca59b881d6695c71f9987d9938ca5feab3ba378c5b
Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.634055 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.634940 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.683854 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerStarted","Data":"a75c7ebcf73ef883ff1c97ca59b881d6695c71f9987d9938ca5feab3ba378c5b"}
Dec 06 09:26:19 crc kubenswrapper[4672]: I1206 09:26:19.852001 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mtfvz"]
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.086808 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s"
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.167148 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bfd54d96c-j66pm"]
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.167383 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerName="dnsmasq-dns" containerID="cri-o://3a3414f86377df1065630c8a1819ec4af10ac678f8ccacaf0a9cdedf02ad3a9c" gracePeriod=10
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.583245 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf51826-9cd4-4e4a-9cc5-316ddef360a3" path="/var/lib/kubelet/pods/9cf51826-9cd4-4e4a-9cc5-316ddef360a3/volumes"
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.716257 4672 generic.go:334] "Generic (PLEG): container finished" podID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerID="3a3414f86377df1065630c8a1819ec4af10ac678f8ccacaf0a9cdedf02ad3a9c" exitCode=0
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.716322 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" event={"ID":"c28bd8fe-e324-4fb3-9056-e15d5dc67b78","Type":"ContainerDied","Data":"3a3414f86377df1065630c8a1819ec4af10ac678f8ccacaf0a9cdedf02ad3a9c"}
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.724844 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mtfvz" event={"ID":"9531e27a-bb7e-4700-9bd9-5008c3c7b12f","Type":"ContainerStarted","Data":"767ad21198037849acf6a95a7a19a08a7e36946276a177e4b827a921edb12e65"}
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.724892 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mtfvz" event={"ID":"9531e27a-bb7e-4700-9bd9-5008c3c7b12f","Type":"ContainerStarted","Data":"5601e5fa03700efb8638a443db67e57be60a58ccb8830fa0b82736b7ebca2e71"}
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.728023 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm"
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.728526 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerStarted","Data":"e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df"}
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.744938 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mtfvz" podStartSLOduration=2.744920102 podStartE2EDuration="2.744920102s" podCreationTimestamp="2025-12-06 09:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:20.742054749 +0000 UTC m=+1198.486315036" watchObservedRunningTime="2025-12-06 09:26:20.744920102 +0000 UTC m=+1198.489180389"
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.867015 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-nb\") pod \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") "
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.867090 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-dns-svc\") pod \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") "
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.867186 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc8kq\" (UniqueName: \"kubernetes.io/projected/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-kube-api-access-nc8kq\") pod \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") "
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.867249 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-sb\") pod \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") "
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.867337 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-config\") pod \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\" (UID: \"c28bd8fe-e324-4fb3-9056-e15d5dc67b78\") "
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.873509 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-kube-api-access-nc8kq" (OuterVolumeSpecName: "kube-api-access-nc8kq") pod "c28bd8fe-e324-4fb3-9056-e15d5dc67b78" (UID: "c28bd8fe-e324-4fb3-9056-e15d5dc67b78"). InnerVolumeSpecName "kube-api-access-nc8kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.923501 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-config" (OuterVolumeSpecName: "config") pod "c28bd8fe-e324-4fb3-9056-e15d5dc67b78" (UID: "c28bd8fe-e324-4fb3-9056-e15d5dc67b78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.928592 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c28bd8fe-e324-4fb3-9056-e15d5dc67b78" (UID: "c28bd8fe-e324-4fb3-9056-e15d5dc67b78"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.929214 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c28bd8fe-e324-4fb3-9056-e15d5dc67b78" (UID: "c28bd8fe-e324-4fb3-9056-e15d5dc67b78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.942850 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c28bd8fe-e324-4fb3-9056-e15d5dc67b78" (UID: "c28bd8fe-e324-4fb3-9056-e15d5dc67b78"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.969764 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.969791 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.969801 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc8kq\" (UniqueName: \"kubernetes.io/projected/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-kube-api-access-nc8kq\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.969809 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:20 crc kubenswrapper[4672]: I1206 09:26:20.969819 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c28bd8fe-e324-4fb3-9056-e15d5dc67b78-config\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.742576 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm" event={"ID":"c28bd8fe-e324-4fb3-9056-e15d5dc67b78","Type":"ContainerDied","Data":"91285225fd5495a2890a1b99ab6e38976d50bf64d571d8ecc277b8f36b099a0b"}
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.743190 4672 scope.go:117] "RemoveContainer" containerID="3a3414f86377df1065630c8a1819ec4af10ac678f8ccacaf0a9cdedf02ad3a9c"
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.743133 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd54d96c-j66pm"
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.757960 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerStarted","Data":"fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e"}
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.814882 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bfd54d96c-j66pm"]
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.837828 4672 scope.go:117] "RemoveContainer" containerID="b5cc9d5611ffdd1b7e3607894e6d8c1e3b295332316db1bce88af0e291228e5b"
Dec 06 09:26:21 crc kubenswrapper[4672]: I1206 09:26:21.844792 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bfd54d96c-j66pm"]
Dec 06 09:26:22 crc kubenswrapper[4672]: I1206 09:26:22.574632 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" path="/var/lib/kubelet/pods/c28bd8fe-e324-4fb3-9056-e15d5dc67b78/volumes"
Dec 06 09:26:22 crc kubenswrapper[4672]: I1206 09:26:22.776187 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerStarted","Data":"50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b"}
Dec 06 09:26:23 crc kubenswrapper[4672]: I1206 09:26:23.789379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerStarted","Data":"fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf"}
Dec 06 09:26:23 crc kubenswrapper[4672]: I1206 09:26:23.790559 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 06 09:26:23 crc kubenswrapper[4672]: I1206 09:26:23.823142 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.343010165 podStartE2EDuration="5.823109394s" podCreationTimestamp="2025-12-06 09:26:18 +0000 UTC" firstStartedPulling="2025-12-06 09:26:19.634734834 +0000 UTC m=+1197.378995121" lastFinishedPulling="2025-12-06 09:26:23.114834063 +0000 UTC m=+1200.859094350" observedRunningTime="2025-12-06 09:26:23.82022174 +0000 UTC m=+1201.564482037" watchObservedRunningTime="2025-12-06 09:26:23.823109394 +0000 UTC m=+1201.567369701"
Dec 06 09:26:25 crc kubenswrapper[4672]: I1206 09:26:25.812441 4672 generic.go:334] "Generic (PLEG): container finished" podID="9531e27a-bb7e-4700-9bd9-5008c3c7b12f" containerID="767ad21198037849acf6a95a7a19a08a7e36946276a177e4b827a921edb12e65" exitCode=0
Dec 06 09:26:25 crc kubenswrapper[4672]: I1206 09:26:25.812724 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mtfvz" event={"ID":"9531e27a-bb7e-4700-9bd9-5008c3c7b12f","Type":"ContainerDied","Data":"767ad21198037849acf6a95a7a19a08a7e36946276a177e4b827a921edb12e65"}
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.028272 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.028634 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.217365 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mtfvz"
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.284528 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmzcf\" (UniqueName: \"kubernetes.io/projected/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-kube-api-access-jmzcf\") pod \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") "
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.284642 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-combined-ca-bundle\") pod \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") "
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.284753 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-config-data\") pod \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") "
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.284811 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-scripts\") pod \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\" (UID: \"9531e27a-bb7e-4700-9bd9-5008c3c7b12f\") "
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.313738 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-kube-api-access-jmzcf" (OuterVolumeSpecName: "kube-api-access-jmzcf") pod "9531e27a-bb7e-4700-9bd9-5008c3c7b12f" (UID: "9531e27a-bb7e-4700-9bd9-5008c3c7b12f"). InnerVolumeSpecName "kube-api-access-jmzcf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.315839 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-scripts" (OuterVolumeSpecName: "scripts") pod "9531e27a-bb7e-4700-9bd9-5008c3c7b12f" (UID: "9531e27a-bb7e-4700-9bd9-5008c3c7b12f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.315978 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9531e27a-bb7e-4700-9bd9-5008c3c7b12f" (UID: "9531e27a-bb7e-4700-9bd9-5008c3c7b12f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.316031 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-config-data" (OuterVolumeSpecName: "config-data") pod "9531e27a-bb7e-4700-9bd9-5008c3c7b12f" (UID: "9531e27a-bb7e-4700-9bd9-5008c3c7b12f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.386836 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.386867 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmzcf\" (UniqueName: \"kubernetes.io/projected/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-kube-api-access-jmzcf\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.386878 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.386887 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9531e27a-bb7e-4700-9bd9-5008c3c7b12f-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.834941 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mtfvz" event={"ID":"9531e27a-bb7e-4700-9bd9-5008c3c7b12f","Type":"ContainerDied","Data":"5601e5fa03700efb8638a443db67e57be60a58ccb8830fa0b82736b7ebca2e71"}
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.835000 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5601e5fa03700efb8638a443db67e57be60a58ccb8830fa0b82736b7ebca2e71"
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.835104 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mtfvz"
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.998309 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.998584 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-log" containerID="cri-o://57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466" gracePeriod=30
Dec 06 09:26:27 crc kubenswrapper[4672]: I1206 09:26:27.998685 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-api" containerID="cri-o://6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f" gracePeriod=30
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.005497 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": EOF"
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.005553 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": EOF"
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.031884 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.032125 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" containerName="nova-scheduler-scheduler" containerID="cri-o://e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187" gracePeriod=30
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.123343 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.123658 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-log" containerID="cri-o://1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c" gracePeriod=30
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.124032 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-metadata" containerID="cri-o://c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818" gracePeriod=30
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.844320 4672 generic.go:334] "Generic (PLEG): container finished" podID="d69c392a-5d64-424c-855d-b4321548387c" containerID="57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466" exitCode=143
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.844354 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d69c392a-5d64-424c-855d-b4321548387c","Type":"ContainerDied","Data":"57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466"}
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.846421 4672 generic.go:334] "Generic (PLEG): container finished" podID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerID="1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c" exitCode=143
Dec 06 09:26:28 crc kubenswrapper[4672]: I1206 09:26:28.846453 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a8fc694-01bf-4882-a9f8-07d026a37ee2","Type":"ContainerDied","Data":"1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c"}
Dec 06 09:26:30 crc kubenswrapper[4672]: E1206 09:26:30.799993 4672 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187 is running failed: container process not found" containerID="e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 09:26:30 crc kubenswrapper[4672]: E1206 09:26:30.803889 4672 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187 is running failed: container process not found" containerID="e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 09:26:30 crc kubenswrapper[4672]: E1206 09:26:30.808113 4672 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187 is running failed: container process not found" containerID="e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 09:26:30 crc kubenswrapper[4672]: E1206 09:26:30.808174 4672 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" containerName="nova-scheduler-scheduler"
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.867181 4672 generic.go:334] "Generic (PLEG): container finished" podID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" containerID="e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187" exitCode=0
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.867242 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2","Type":"ContainerDied","Data":"e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187"}
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.867291 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2","Type":"ContainerDied","Data":"b33c16d8ffbc119ad65281554f37c7c6e784d249ca3a5b686312a98b5fc0dad6"}
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.867308 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33c16d8ffbc119ad65281554f37c7c6e784d249ca3a5b686312a98b5fc0dad6"
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.932382 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.943286 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-combined-ca-bundle\") pod \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") "
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.943470 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-config-data\") pod \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") "
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.943500 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk85f\" (UniqueName: \"kubernetes.io/projected/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-kube-api-access-fk85f\") pod \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\" (UID: \"2b05d0f1-5b9d-4e82-84ff-8addbc45aec2\") "
Dec 06 09:26:30 crc kubenswrapper[4672]: I1206 09:26:30.959018 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-kube-api-access-fk85f" (OuterVolumeSpecName: "kube-api-access-fk85f") pod "2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" (UID: "2b05d0f1-5b9d-4e82-84ff-8addbc45aec2"). InnerVolumeSpecName "kube-api-access-fk85f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.013833 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" (UID: "2b05d0f1-5b9d-4e82-84ff-8addbc45aec2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.031725 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-config-data" (OuterVolumeSpecName: "config-data") pod "2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" (UID: "2b05d0f1-5b9d-4e82-84ff-8addbc45aec2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.045262 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.045462 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.045523 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk85f\" (UniqueName: \"kubernetes.io/projected/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2-kube-api-access-fk85f\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.288110 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.174:8775/\": read tcp 10.217.0.2:58934->10.217.0.174:8775: read: connection reset by peer"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.288110 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.174:8775/\": read tcp 10.217.0.2:58920->10.217.0.174:8775: read: connection reset by peer"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.670507 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.758527 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-combined-ca-bundle\") pod \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") "
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.758671 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jck7x\" (UniqueName: \"kubernetes.io/projected/8a8fc694-01bf-4882-a9f8-07d026a37ee2-kube-api-access-jck7x\") pod \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") "
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.758711 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-config-data\") pod \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") "
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.758890 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-nova-metadata-tls-certs\") pod \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") "
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.758923 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8fc694-01bf-4882-a9f8-07d026a37ee2-logs\") pod \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\" (UID: \"8a8fc694-01bf-4882-a9f8-07d026a37ee2\") "
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.759811 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8fc694-01bf-4882-a9f8-07d026a37ee2-logs" (OuterVolumeSpecName: "logs") pod "8a8fc694-01bf-4882-a9f8-07d026a37ee2" (UID: "8a8fc694-01bf-4882-a9f8-07d026a37ee2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.770960 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8fc694-01bf-4882-a9f8-07d026a37ee2-kube-api-access-jck7x" (OuterVolumeSpecName: "kube-api-access-jck7x") pod "8a8fc694-01bf-4882-a9f8-07d026a37ee2" (UID: "8a8fc694-01bf-4882-a9f8-07d026a37ee2"). InnerVolumeSpecName "kube-api-access-jck7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.816281 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-config-data" (OuterVolumeSpecName: "config-data") pod "8a8fc694-01bf-4882-a9f8-07d026a37ee2" (UID: "8a8fc694-01bf-4882-a9f8-07d026a37ee2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.836031 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8a8fc694-01bf-4882-a9f8-07d026a37ee2" (UID: "8a8fc694-01bf-4882-a9f8-07d026a37ee2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.846242 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a8fc694-01bf-4882-a9f8-07d026a37ee2" (UID: "8a8fc694-01bf-4882-a9f8-07d026a37ee2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.860452 4672 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.860481 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8fc694-01bf-4882-a9f8-07d026a37ee2-logs\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.860508 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.860518 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jck7x\" (UniqueName: \"kubernetes.io/projected/8a8fc694-01bf-4882-a9f8-07d026a37ee2-kube-api-access-jck7x\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.860526 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8fc694-01bf-4882-a9f8-07d026a37ee2-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.878472 4672 generic.go:334] "Generic (PLEG): container finished" podID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerID="c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818" exitCode=0
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.878519 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a8fc694-01bf-4882-a9f8-07d026a37ee2","Type":"ContainerDied","Data":"c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818"}
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.878552 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.878571 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.878589 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a8fc694-01bf-4882-a9f8-07d026a37ee2","Type":"ContainerDied","Data":"1a4341bb5f5009f04bdadc83e7efa94ffe8e8e417e952befa1ae7d47f7297b98"}
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.878621 4672 scope.go:117] "RemoveContainer" containerID="c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.920406 4672 scope.go:117] "RemoveContainer" containerID="1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.948053 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.962240 4672 scope.go:117] "RemoveContainer" containerID="c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.962724 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818\": container with ID starting with c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818 not found: ID does not exist" containerID="c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.962751 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818"} err="failed to get container status \"c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818\": rpc error: code = NotFound desc = could not find container \"c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818\": container with ID starting with c1e2396203e3e0186e0e875737f8010bd4e89c783e31dae43aa25a3c81ec7818 not found: ID does not exist"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.962771 4672 scope.go:117] "RemoveContainer" containerID="1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.963089 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c\": container with ID starting with 1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c not found: ID does not exist" containerID="1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.963118 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c"} err="failed to get container status \"1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c\": rpc error: code = NotFound desc = could not find container \"1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c\": container with ID starting with 1b633af106ee2192aa25533c340e84906a577eb6eebbec06978bc6e3a755b39c not found: ID does not exist"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.963526 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.970985 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.978566 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993135 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.993509 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-metadata"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993526 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-metadata"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.993540 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerName="init"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993546 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerName="init"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.993586 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerName="dnsmasq-dns"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993607 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerName="dnsmasq-dns"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.993620 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-log"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993627 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-log"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.993637 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9531e27a-bb7e-4700-9bd9-5008c3c7b12f" containerName="nova-manage"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993643 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9531e27a-bb7e-4700-9bd9-5008c3c7b12f" containerName="nova-manage"
Dec 06 09:26:31 crc kubenswrapper[4672]: E1206 09:26:31.993652 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" containerName="nova-scheduler-scheduler"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993657 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" containerName="nova-scheduler-scheduler"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993849 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9531e27a-bb7e-4700-9bd9-5008c3c7b12f" containerName="nova-manage"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993864 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-log"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993875 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" containerName="nova-metadata-metadata"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993894 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28bd8fe-e324-4fb3-9056-e15d5dc67b78" containerName="dnsmasq-dns"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.993904 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" containerName="nova-scheduler-scheduler"
Dec 06 09:26:31 crc kubenswrapper[4672]: I1206 09:26:31.994521 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.000874 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.003118 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.004523 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.008735 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.018146 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.018169 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.019104 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063398 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-config-data\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063494 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0222d9d-628a-423d-b12a-377e94b3ac5c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063652 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063716 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-logs\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063754 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwx5\" (UniqueName: \"kubernetes.io/projected/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-kube-api-access-xnwx5\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063881 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ndxn\" (UniqueName: \"kubernetes.io/projected/e0222d9d-628a-423d-b12a-377e94b3ac5c-kube-api-access-5ndxn\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.063984 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.064118 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0222d9d-628a-423d-b12a-377e94b3ac5c-config-data\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.164827 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.164895 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0222d9d-628a-423d-b12a-377e94b3ac5c-config-data\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.164920 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-config-data\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.164975 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0222d9d-628a-423d-b12a-377e94b3ac5c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.164992 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.165007 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-logs\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.165026 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwx5\" (UniqueName: \"kubernetes.io/projected/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-kube-api-access-xnwx5\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.165058 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ndxn\" (UniqueName: \"kubernetes.io/projected/e0222d9d-628a-423d-b12a-377e94b3ac5c-kube-api-access-5ndxn\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.167174 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-logs\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.169982 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.170509 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0222d9d-628a-423d-b12a-377e94b3ac5c-config-data\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.171073 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-config-data\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.171487 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.174904 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0222d9d-628a-423d-b12a-377e94b3ac5c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.184636 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ndxn\" (UniqueName: \"kubernetes.io/projected/e0222d9d-628a-423d-b12a-377e94b3ac5c-kube-api-access-5ndxn\") pod \"nova-scheduler-0\" (UID: \"e0222d9d-628a-423d-b12a-377e94b3ac5c\") " pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.184741 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwx5\" (UniqueName: \"kubernetes.io/projected/2610b3a3-94e4-4583-b42a-739e7dd1bfc7-kube-api-access-xnwx5\") pod \"nova-metadata-0\" (UID: \"2610b3a3-94e4-4583-b42a-739e7dd1bfc7\") " pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.311292 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.328396 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.576357 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b05d0f1-5b9d-4e82-84ff-8addbc45aec2" path="/var/lib/kubelet/pods/2b05d0f1-5b9d-4e82-84ff-8addbc45aec2/volumes"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.582886 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8fc694-01bf-4882-a9f8-07d026a37ee2" path="/var/lib/kubelet/pods/8a8fc694-01bf-4882-a9f8-07d026a37ee2/volumes"
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.765724 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 09:26:32 crc kubenswrapper[4672]: W1206 09:26:32.862118 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2610b3a3_94e4_4583_b42a_739e7dd1bfc7.slice/crio-f1d93a28bc496d7ea34923322000b5de393298c12d8d06dac119ca2144a5b019 WatchSource:0}: Error finding container f1d93a28bc496d7ea34923322000b5de393298c12d8d06dac119ca2144a5b019: Status 404 returned error can't find the container with id f1d93a28bc496d7ea34923322000b5de393298c12d8d06dac119ca2144a5b019
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.862714 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.906482 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e0222d9d-628a-423d-b12a-377e94b3ac5c","Type":"ContainerStarted","Data":"3bdc58cdaa5282e00f34ab10eeb7c32c5c8a9231db458d1a411d1a7d154b94bf"}
Dec 06 09:26:32 crc kubenswrapper[4672]: I1206 09:26:32.909507 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2610b3a3-94e4-4583-b42a-739e7dd1bfc7","Type":"ContainerStarted","Data":"f1d93a28bc496d7ea34923322000b5de393298c12d8d06dac119ca2144a5b019"}
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.888888 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.920231 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-config-data\") pod \"d69c392a-5d64-424c-855d-b4321548387c\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") "
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.920331 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-public-tls-certs\") pod \"d69c392a-5d64-424c-855d-b4321548387c\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") "
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.920444 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7df87\" (UniqueName: \"kubernetes.io/projected/d69c392a-5d64-424c-855d-b4321548387c-kube-api-access-7df87\") pod \"d69c392a-5d64-424c-855d-b4321548387c\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") "
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.920473 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-internal-tls-certs\") pod \"d69c392a-5d64-424c-855d-b4321548387c\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") "
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.920518 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69c392a-5d64-424c-855d-b4321548387c-logs\") pod \"d69c392a-5d64-424c-855d-b4321548387c\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") "
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.920567 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-combined-ca-bundle\") pod \"d69c392a-5d64-424c-855d-b4321548387c\" (UID: \"d69c392a-5d64-424c-855d-b4321548387c\") "
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.926392 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d69c392a-5d64-424c-855d-b4321548387c-logs" (OuterVolumeSpecName: "logs") pod "d69c392a-5d64-424c-855d-b4321548387c" (UID: "d69c392a-5d64-424c-855d-b4321548387c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.930890 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2610b3a3-94e4-4583-b42a-739e7dd1bfc7","Type":"ContainerStarted","Data":"9b9d05f9a4855da833390f65361711d102c3adb8692093504c3bd3c923f6b146"}
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.930930 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2610b3a3-94e4-4583-b42a-739e7dd1bfc7","Type":"ContainerStarted","Data":"9e06160964ca74bd0a6534dcfdb90a38b0d0a07de92f5b402fd88dd00f9f3fb5"}
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.935696 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e0222d9d-628a-423d-b12a-377e94b3ac5c","Type":"ContainerStarted","Data":"a04ca902dcfe4688357c475b683bcffa4b04cb0aee3b2b670106487100c92cee"}
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.941283 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69c392a-5d64-424c-855d-b4321548387c-kube-api-access-7df87" (OuterVolumeSpecName: "kube-api-access-7df87") pod "d69c392a-5d64-424c-855d-b4321548387c" (UID: "d69c392a-5d64-424c-855d-b4321548387c"). InnerVolumeSpecName "kube-api-access-7df87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.947311 4672 generic.go:334] "Generic (PLEG): container finished" podID="d69c392a-5d64-424c-855d-b4321548387c" containerID="6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f" exitCode=0
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.947363 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d69c392a-5d64-424c-855d-b4321548387c","Type":"ContainerDied","Data":"6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f"}
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.947395 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d69c392a-5d64-424c-855d-b4321548387c","Type":"ContainerDied","Data":"007c69727b750d18d21e4d422a07a3ec70245beb43b036d6d4d8cdee0e040a5e"}
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.947414 4672 scope.go:117] "RemoveContainer" containerID="6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f"
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.949756 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.962333 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9623137760000002 podStartE2EDuration="2.962313776s" podCreationTimestamp="2025-12-06 09:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:33.953288304 +0000 UTC m=+1211.697548591" watchObservedRunningTime="2025-12-06 09:26:33.962313776 +0000 UTC m=+1211.706574073"
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.980470 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-config-data" (OuterVolumeSpecName: "config-data") pod "d69c392a-5d64-424c-855d-b4321548387c" (UID: "d69c392a-5d64-424c-855d-b4321548387c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.986001 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d69c392a-5d64-424c-855d-b4321548387c" (UID: "d69c392a-5d64-424c-855d-b4321548387c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.987845 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d69c392a-5d64-424c-855d-b4321548387c" (UID: "d69c392a-5d64-424c-855d-b4321548387c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.989829 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.989812943 podStartE2EDuration="2.989812943s" podCreationTimestamp="2025-12-06 09:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:33.976220053 +0000 UTC m=+1211.720480340" watchObservedRunningTime="2025-12-06 09:26:33.989812943 +0000 UTC m=+1211.734073230"
Dec 06 09:26:33 crc kubenswrapper[4672]: I1206 09:26:33.991839 4672 scope.go:117] "RemoveContainer" containerID="57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.010138 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d69c392a-5d64-424c-855d-b4321548387c" (UID: "d69c392a-5d64-424c-855d-b4321548387c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.020870 4672 scope.go:117] "RemoveContainer" containerID="6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f"
Dec 06 09:26:34 crc kubenswrapper[4672]: E1206 09:26:34.021464 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f\": container with ID starting with 6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f not found: ID does not exist" containerID="6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.021499 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f"} err="failed to get container status \"6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f\": rpc error: code = NotFound desc = could not find container \"6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f\": container with ID starting with 6b223c0a1b010398bda08418c8ceb2367ba360569959279bf7de8d9f52bc657f not found: ID does not exist"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.021519 4672 scope.go:117] "RemoveContainer" containerID="57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466"
Dec 06 09:26:34 crc kubenswrapper[4672]: E1206 09:26:34.022632 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466\": container with ID starting with 57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466 not found: ID does not exist" containerID="57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.022657 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466"} err="failed to get container status \"57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466\": rpc error: code = NotFound desc = could not find container \"57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466\": container with ID starting with 57c412fbfbedf1c6b525935e3eed8b5e354c10d4bcb88f9623fc338651195466 not found: ID does not exist"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.023013 4672 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.023101 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7df87\" (UniqueName: \"kubernetes.io/projected/d69c392a-5d64-424c-855d-b4321548387c-kube-api-access-7df87\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.023135 4672 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.023145 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d69c392a-5d64-424c-855d-b4321548387c-logs\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.023175 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.023184 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69c392a-5d64-424c-855d-b4321548387c-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.287105 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.296965 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.306403 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 06 09:26:34 crc kubenswrapper[4672]: E1206 09:26:34.306752 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-api"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.306768 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-api"
Dec 06 09:26:34 crc kubenswrapper[4672]: E1206 09:26:34.306787 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-log"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.306792 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-log"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.306948 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-api"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.306964 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69c392a-5d64-424c-855d-b4321548387c" containerName="nova-api-log"
Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.307792 4672 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.310544 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.310740 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.319197 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.322122 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.429471 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-logs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.429820 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-public-tls-certs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.429965 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2d59\" (UniqueName: \"kubernetes.io/projected/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-kube-api-access-j2d59\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.430120 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.430225 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.430325 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-config-data\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.532112 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.532276 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.532384 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-config-data\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.532586 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-logs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.532835 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-public-tls-certs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.533466 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2d59\" (UniqueName: \"kubernetes.io/projected/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-kube-api-access-j2d59\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.532967 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-logs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.539235 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.539331 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.539451 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-config-data\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.545389 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-public-tls-certs\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.555667 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2d59\" (UniqueName: \"kubernetes.io/projected/8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1-kube-api-access-j2d59\") pod \"nova-api-0\" (UID: \"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1\") " 
pod="openstack/nova-api-0" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.570683 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69c392a-5d64-424c-855d-b4321548387c" path="/var/lib/kubelet/pods/d69c392a-5d64-424c-855d-b4321548387c/volumes" Dec 06 09:26:34 crc kubenswrapper[4672]: I1206 09:26:34.632689 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:26:35 crc kubenswrapper[4672]: I1206 09:26:35.096540 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:26:35 crc kubenswrapper[4672]: W1206 09:26:35.100225 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eb0159c_5fe5_4ec4_9f3c_ba851fedf3f1.slice/crio-77f952c0f5674fe402e3e2b3ef6b38ac81d8c24629c774fffbc97585b64f4b16 WatchSource:0}: Error finding container 77f952c0f5674fe402e3e2b3ef6b38ac81d8c24629c774fffbc97585b64f4b16: Status 404 returned error can't find the container with id 77f952c0f5674fe402e3e2b3ef6b38ac81d8c24629c774fffbc97585b64f4b16 Dec 06 09:26:35 crc kubenswrapper[4672]: I1206 09:26:35.986279 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1","Type":"ContainerStarted","Data":"3ea62c2cef8709f0226d8369fec755f73896dad1fc9db5899d21543c96ee923c"} Dec 06 09:26:35 crc kubenswrapper[4672]: I1206 09:26:35.986920 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1","Type":"ContainerStarted","Data":"60575ad4ee332fd973872cd963aa7bced3a753d3e4e074a0be4f3218f7f0bbd3"} Dec 06 09:26:35 crc kubenswrapper[4672]: I1206 09:26:35.986938 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1","Type":"ContainerStarted","Data":"77f952c0f5674fe402e3e2b3ef6b38ac81d8c24629c774fffbc97585b64f4b16"} Dec 06 09:26:36 crc kubenswrapper[4672]: I1206 09:26:36.024740 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.024710974 podStartE2EDuration="2.024710974s" podCreationTimestamp="2025-12-06 09:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:26:36.013202808 +0000 UTC m=+1213.757463135" watchObservedRunningTime="2025-12-06 09:26:36.024710974 +0000 UTC m=+1213.768971301" Dec 06 09:26:37 crc kubenswrapper[4672]: I1206 09:26:37.312072 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 09:26:37 crc kubenswrapper[4672]: I1206 09:26:37.329242 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:26:37 crc kubenswrapper[4672]: I1206 09:26:37.329290 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.311543 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.320050 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.320253 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.320363 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.320930 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a101a6d3a9ea73e6619b9412aec8733c3ef377249e41f4c656d15ff2d987965d"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.321096 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://a101a6d3a9ea73e6619b9412aec8733c3ef377249e41f4c656d15ff2d987965d" gracePeriod=600 Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.329367 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.329422 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:26:42 crc kubenswrapper[4672]: I1206 09:26:42.355748 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.060637 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="a101a6d3a9ea73e6619b9412aec8733c3ef377249e41f4c656d15ff2d987965d" exitCode=0 Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.064726 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"a101a6d3a9ea73e6619b9412aec8733c3ef377249e41f4c656d15ff2d987965d"} Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.064800 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"928ca0b7127fa7d124fdf57161f7517b407b723d73a8c32f6adf1f0ea4548786"} Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.064819 4672 scope.go:117] "RemoveContainer" containerID="6dc0e941a4dd3e79f056ce0d1f08eb3aa888fb31efcafdbd3ecc3f28c01b9f06" Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.106559 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.344775 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2610b3a3-94e4-4583-b42a-739e7dd1bfc7" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:26:43 crc kubenswrapper[4672]: I1206 09:26:43.344820 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2610b3a3-94e4-4583-b42a-739e7dd1bfc7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:26:44 crc kubenswrapper[4672]: I1206 09:26:44.634097 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:26:44 crc kubenswrapper[4672]: I1206 09:26:44.634525 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:26:45 crc kubenswrapper[4672]: I1206 09:26:45.652824 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:26:45 crc kubenswrapper[4672]: I1206 09:26:45.652834 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:26:49 crc kubenswrapper[4672]: I1206 09:26:49.177522 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 09:26:52 crc kubenswrapper[4672]: I1206 09:26:52.337444 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:26:52 crc kubenswrapper[4672]: I1206 09:26:52.338122 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:26:52 crc kubenswrapper[4672]: I1206 09:26:52.348785 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:26:52 crc kubenswrapper[4672]: I1206 09:26:52.355254 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:26:54 crc kubenswrapper[4672]: I1206 09:26:54.641005 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:26:54 crc kubenswrapper[4672]: I1206 09:26:54.641964 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:26:54 crc kubenswrapper[4672]: I1206 09:26:54.643138 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:26:54 crc kubenswrapper[4672]: I1206 09:26:54.643274 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:26:54 crc kubenswrapper[4672]: I1206 09:26:54.652131 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:26:54 crc kubenswrapper[4672]: I1206 09:26:54.654404 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:27:02 crc kubenswrapper[4672]: I1206 09:27:02.753192 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:27:04 crc kubenswrapper[4672]: 
I1206 09:27:04.258347 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:27:07 crc kubenswrapper[4672]: I1206 09:27:07.470537 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="54ae723f-36b7-4991-9439-23af064249fa" containerName="rabbitmq" containerID="cri-o://e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f" gracePeriod=604796 Dec 06 09:27:08 crc kubenswrapper[4672]: I1206 09:27:08.629525 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerName="rabbitmq" containerID="cri-o://c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f" gracePeriod=604796 Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.041917 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.116791 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-tls\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.116827 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-plugins-conf\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.116842 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-confd\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.116858 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-config-data\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.116963 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-server-conf\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.116996 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-plugins\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.117012 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.117082 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qq4fg\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-kube-api-access-qq4fg\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.117115 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54ae723f-36b7-4991-9439-23af064249fa-pod-info\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.117135 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-erlang-cookie\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.117155 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54ae723f-36b7-4991-9439-23af064249fa-erlang-cookie-secret\") pod \"54ae723f-36b7-4991-9439-23af064249fa\" (UID: \"54ae723f-36b7-4991-9439-23af064249fa\") " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.117713 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.118830 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.118090 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.142207 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/54ae723f-36b7-4991-9439-23af064249fa-pod-info" (OuterVolumeSpecName: "pod-info") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.142207 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.142276 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ae723f-36b7-4991-9439-23af064249fa-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.142342 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-kube-api-access-qq4fg" (OuterVolumeSpecName: "kube-api-access-qq4fg") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "kube-api-access-qq4fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.186894 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.196429 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-config-data" (OuterVolumeSpecName: "config-data") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.220828 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq4fg\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-kube-api-access-qq4fg\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.221213 4672 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54ae723f-36b7-4991-9439-23af064249fa-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.221391 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.221497 4672 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54ae723f-36b7-4991-9439-23af064249fa-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.222084 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.222217 4672 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.222300 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.222379 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.222481 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.242205 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-server-conf" (OuterVolumeSpecName: "server-conf") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.256351 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.310128 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "54ae723f-36b7-4991-9439-23af064249fa" (UID: "54ae723f-36b7-4991-9439-23af064249fa"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.324274 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54ae723f-36b7-4991-9439-23af064249fa-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.324314 4672 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54ae723f-36b7-4991-9439-23af064249fa-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.324324 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.384399 4672 generic.go:334] "Generic (PLEG): container finished" podID="54ae723f-36b7-4991-9439-23af064249fa" containerID="e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f" exitCode=0 Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.384459 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.384450 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"54ae723f-36b7-4991-9439-23af064249fa","Type":"ContainerDied","Data":"e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f"} Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.384636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"54ae723f-36b7-4991-9439-23af064249fa","Type":"ContainerDied","Data":"8058a379a7cb28e6983aab03c6bdda9846d95fec18852baf8148b9c9e1d55b53"} Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.384664 4672 scope.go:117] "RemoveContainer" containerID="e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.413486 4672 scope.go:117] "RemoveContainer" containerID="89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.428305 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.448727 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.463838 4672 scope.go:117] "RemoveContainer" containerID="e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f" Dec 06 09:27:14 crc kubenswrapper[4672]: E1206 09:27:14.470965 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f\": container with ID starting with e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f not found: ID does not exist" containerID="e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.471028 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f"} err="failed to get container status \"e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f\": rpc error: code = NotFound desc = could 
not find container \"e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f\": container with ID starting with e7d4689ccc9708e66c57367ceb978a99f26aa88a6ba0b89b1a5be2d209c0238f not found: ID does not exist" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.471055 4672 scope.go:117] "RemoveContainer" containerID="89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473" Dec 06 09:27:14 crc kubenswrapper[4672]: E1206 09:27:14.473155 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473\": container with ID starting with 89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473 not found: ID does not exist" containerID="89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.473182 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473"} err="failed to get container status \"89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473\": rpc error: code = NotFound desc = could not find container \"89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473\": container with ID starting with 89614a71a15f9d9f77af3aefe664b3203631d0d02c91b9ee18b19df2d3ae7473 not found: ID does not exist" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.481379 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:27:14 crc kubenswrapper[4672]: E1206 09:27:14.481745 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ae723f-36b7-4991-9439-23af064249fa" containerName="rabbitmq" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.481760 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ae723f-36b7-4991-9439-23af064249fa" containerName="rabbitmq" Dec 06 09:27:14 crc kubenswrapper[4672]: E1206 09:27:14.481775 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ae723f-36b7-4991-9439-23af064249fa" containerName="setup-container" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.481781 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ae723f-36b7-4991-9439-23af064249fa" containerName="setup-container" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.481935 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ae723f-36b7-4991-9439-23af064249fa" containerName="rabbitmq" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.482921 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.486060 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.486357 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.486472 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.486657 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.486813 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.486913 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.487095 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5xzx4" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.499592 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.574166 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ae723f-36b7-4991-9439-23af064249fa" path="/var/lib/kubelet/pods/54ae723f-36b7-4991-9439-23af064249fa/volumes" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630427 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630488 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630534 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630569 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-config-data\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630591 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc 
kubenswrapper[4672]: I1206 09:27:14.630677 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630724 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f3cf9f22-30ac-48ca-9d05-407868710c73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630738 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f3cf9f22-30ac-48ca-9d05-407868710c73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630790 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpl6n\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-kube-api-access-lpl6n\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.630810 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732033 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732105 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-config-data\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732132 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732201 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732260 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f3cf9f22-30ac-48ca-9d05-407868710c73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732282 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732327 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f3cf9f22-30ac-48ca-9d05-407868710c73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732351 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpl6n\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-kube-api-access-lpl6n\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732374 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732404 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732438 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732830 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.732844 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" 
Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.733348 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-config-data\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.733775 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.734555 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.735052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f3cf9f22-30ac-48ca-9d05-407868710c73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.737250 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f3cf9f22-30ac-48ca-9d05-407868710c73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.742754 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.751687 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.759396 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f3cf9f22-30ac-48ca-9d05-407868710c73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.763446 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpl6n\" (UniqueName: \"kubernetes.io/projected/f3cf9f22-30ac-48ca-9d05-407868710c73-kube-api-access-lpl6n\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " pod="openstack/rabbitmq-server-0" Dec 06 09:27:14 crc kubenswrapper[4672]: I1206 09:27:14.819354 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"f3cf9f22-30ac-48ca-9d05-407868710c73\") " 
pod="openstack/rabbitmq-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.103612 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.194650 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342392 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-config-data\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342450 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-server-conf\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342468 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-confd\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342559 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342591 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-plugins-conf\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342650 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bbe623e-19ec-49f2-bfa4-65728b94d035-pod-info\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342691 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-erlang-cookie\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342712 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbf2q\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-kube-api-access-nbf2q\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342755 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-tls\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") " Dec 06 
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342786 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-plugins\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") "
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.342808 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bbe623e-19ec-49f2-bfa4-65728b94d035-erlang-cookie-secret\") pod \"1bbe623e-19ec-49f2-bfa4-65728b94d035\" (UID: \"1bbe623e-19ec-49f2-bfa4-65728b94d035\") "
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.344330 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.347968 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1bbe623e-19ec-49f2-bfa4-65728b94d035-pod-info" (OuterVolumeSpecName: "pod-info") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.348267 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.348561 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.349471 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.350711 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.351586 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bbe623e-19ec-49f2-bfa4-65728b94d035-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.361388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-kube-api-access-nbf2q" (OuterVolumeSpecName: "kube-api-access-nbf2q") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "kube-api-access-nbf2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.394340 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-config-data" (OuterVolumeSpecName: "config-data") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.398969 4672 generic.go:334] "Generic (PLEG): container finished" podID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerID="c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f" exitCode=0
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.399035 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1bbe623e-19ec-49f2-bfa4-65728b94d035","Type":"ContainerDied","Data":"c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f"}
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.399058 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1bbe623e-19ec-49f2-bfa4-65728b94d035","Type":"ContainerDied","Data":"629e8a99471dbe0c3c436af3076b7d79db4d171f2c92685ff437d4ec1106b5b9"}
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.399073 4672 scope.go:117] "RemoveContainer" containerID="c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f"
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.399180 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.411388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-server-conf" (OuterVolumeSpecName: "server-conf") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.438786 4672 scope.go:117] "RemoveContainer" containerID="0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446198 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446229 4672 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446239 4672 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1bbe623e-19ec-49f2-bfa4-65728b94d035-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446250 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446260 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbf2q\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-kube-api-access-nbf2q\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446268 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446277 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446285 4672 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1bbe623e-19ec-49f2-bfa4-65728b94d035-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446293 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.446303 4672 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1bbe623e-19ec-49f2-bfa4-65728b94d035-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.470772 4672 scope.go:117] "RemoveContainer" containerID="c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f" Dec 06 09:27:15 crc kubenswrapper[4672]: E1206 09:27:15.476590 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f\": container with ID starting with c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f not found: ID does not exist" 
containerID="c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.476666 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f"} err="failed to get container status \"c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f\": rpc error: code = NotFound desc = could not find container \"c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f\": container with ID starting with c96b7335657d68f744cd6906081458b139e9f155f899c5b2acdd86893bf9b88f not found: ID does not exist" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.476699 4672 scope.go:117] "RemoveContainer" containerID="0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2" Dec 06 09:27:15 crc kubenswrapper[4672]: E1206 09:27:15.477066 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2\": container with ID starting with 0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2 not found: ID does not exist" containerID="0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.477102 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2"} err="failed to get container status \"0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2\": rpc error: code = NotFound desc = could not find container \"0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2\": container with ID starting with 0146bf3c3518e6366d9ba504ec80e4a862ae7f202781214ec92564f51d9798e2 not found: ID does not exist" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.481041 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.482828 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1bbe623e-19ec-49f2-bfa4-65728b94d035" (UID: "1bbe623e-19ec-49f2-bfa4-65728b94d035"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.548379 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1bbe623e-19ec-49f2-bfa4-65728b94d035-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.548412 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.589673 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.773294 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.783955 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.800543 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:27:15 crc kubenswrapper[4672]: E1206 09:27:15.801360 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerName="setup-container" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.801380 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerName="setup-container" Dec 06 09:27:15 crc kubenswrapper[4672]: E1206 09:27:15.801405 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerName="rabbitmq" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.801412 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerName="rabbitmq" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.801577 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" containerName="rabbitmq" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.802469 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.804829 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.805043 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.805475 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.805657 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.805812 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.805977 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.806287 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-l5sdg" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.819982 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855554 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855591 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f71615b-1205-44b2-b4aa-c03548716486-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855651 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855677 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f71615b-1205-44b2-b4aa-c03548716486-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855699 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855892 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.855927 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.856007 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhpt\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-kube-api-access-6zhpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.856037 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.856058 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.856088 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.957413 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.957452 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.957508 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zhpt\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-kube-api-access-6zhpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.957545 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.957573 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.957770 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.958191 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.958238 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.958934 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.959413 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.959569 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.959667 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f71615b-1205-44b2-b4aa-c03548716486-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.959702 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.959739 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f71615b-1205-44b2-b4aa-c03548716486-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.959764 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.960523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.960809 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f71615b-1205-44b2-b4aa-c03548716486-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.964313 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.964771 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f71615b-1205-44b2-b4aa-c03548716486-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.964872 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f71615b-1205-44b2-b4aa-c03548716486-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.971238 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.984092 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zhpt\" (UniqueName: \"kubernetes.io/projected/3f71615b-1205-44b2-b4aa-c03548716486-kube-api-access-6zhpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:15 crc kubenswrapper[4672]: I1206 09:27:15.988221 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f71615b-1205-44b2-b4aa-c03548716486\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:16 crc kubenswrapper[4672]: I1206 09:27:16.130054 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:16 crc kubenswrapper[4672]: I1206 09:27:16.406950 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 09:27:16 crc kubenswrapper[4672]: I1206 09:27:16.413266 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3cf9f22-30ac-48ca-9d05-407868710c73","Type":"ContainerStarted","Data":"669bc0802bb5178857c68a7f477f7723b849420b8308351eced501487769fa64"} Dec 06 09:27:16 crc kubenswrapper[4672]: W1206 09:27:16.415662 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f71615b_1205_44b2_b4aa_c03548716486.slice/crio-7887421342e3bcfab48896af4b19820a494c7d80c7e984757aa7d41631da502a WatchSource:0}: Error finding container 7887421342e3bcfab48896af4b19820a494c7d80c7e984757aa7d41631da502a: Status 404 returned error can't find the container with id 7887421342e3bcfab48896af4b19820a494c7d80c7e984757aa7d41631da502a Dec 06 09:27:16 crc kubenswrapper[4672]: I1206 09:27:16.567628 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bbe623e-19ec-49f2-bfa4-65728b94d035" path="/var/lib/kubelet/pods/1bbe623e-19ec-49f2-bfa4-65728b94d035/volumes" Dec 06 09:27:17 crc kubenswrapper[4672]: I1206 09:27:17.424637 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f71615b-1205-44b2-b4aa-c03548716486","Type":"ContainerStarted","Data":"7887421342e3bcfab48896af4b19820a494c7d80c7e984757aa7d41631da502a"} Dec 06 09:27:17 crc kubenswrapper[4672]: I1206 09:27:17.426516 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3cf9f22-30ac-48ca-9d05-407868710c73","Type":"ContainerStarted","Data":"a58e59f3087910789f4c09f4391850be06c566c3464160e453533a09ae57925a"} Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.438348 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f71615b-1205-44b2-b4aa-c03548716486","Type":"ContainerStarted","Data":"55692ea67f7a9e733685901b276040278ab90788c3cce45b3bb3b638eb5b77a1"} Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.645663 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75ff95b8d7-sgwmd"] Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.647114 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.649139 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.674987 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ff95b8d7-sgwmd"] Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.711274 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-openstack-edpm-ipam\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.711319 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-nb\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.711341 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-sb\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.711374 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfsq\" (UniqueName: \"kubernetes.io/projected/0ea4505d-157e-4390-bcc0-4dd53c9ee787-kube-api-access-ltfsq\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.711411 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-dns-svc\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.711464 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-config\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.812445 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-openstack-edpm-ipam\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.812790 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-nb\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: 
\"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.812813 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-sb\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.812861 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfsq\" (UniqueName: \"kubernetes.io/projected/0ea4505d-157e-4390-bcc0-4dd53c9ee787-kube-api-access-ltfsq\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.812900 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-dns-svc\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.812951 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-config\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.813224 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-openstack-edpm-ipam\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.814055 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-dns-svc\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.814081 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-sb\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.814440 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-nb\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.814629 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-config\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: 
I1206 09:27:18.840644 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfsq\" (UniqueName: \"kubernetes.io/projected/0ea4505d-157e-4390-bcc0-4dd53c9ee787-kube-api-access-ltfsq\") pod \"dnsmasq-dns-75ff95b8d7-sgwmd\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:18 crc kubenswrapper[4672]: I1206 09:27:18.964106 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:19 crc kubenswrapper[4672]: I1206 09:27:19.433877 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ff95b8d7-sgwmd"] Dec 06 09:27:19 crc kubenswrapper[4672]: W1206 09:27:19.438100 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea4505d_157e_4390_bcc0_4dd53c9ee787.slice/crio-c3f7eff42592fa57a2bf5f012d7bd3b3312eac9f308e21bad6ff80183409feab WatchSource:0}: Error finding container c3f7eff42592fa57a2bf5f012d7bd3b3312eac9f308e21bad6ff80183409feab: Status 404 returned error can't find the container with id c3f7eff42592fa57a2bf5f012d7bd3b3312eac9f308e21bad6ff80183409feab Dec 06 09:27:19 crc kubenswrapper[4672]: I1206 09:27:19.450229 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" event={"ID":"0ea4505d-157e-4390-bcc0-4dd53c9ee787","Type":"ContainerStarted","Data":"c3f7eff42592fa57a2bf5f012d7bd3b3312eac9f308e21bad6ff80183409feab"} Dec 06 09:27:20 crc kubenswrapper[4672]: I1206 09:27:20.465348 4672 generic.go:334] "Generic (PLEG): container finished" podID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerID="135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84" exitCode=0 Dec 06 09:27:20 crc kubenswrapper[4672]: I1206 09:27:20.465469 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" event={"ID":"0ea4505d-157e-4390-bcc0-4dd53c9ee787","Type":"ContainerDied","Data":"135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84"} Dec 06 09:27:21 crc kubenswrapper[4672]: I1206 09:27:21.488334 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" event={"ID":"0ea4505d-157e-4390-bcc0-4dd53c9ee787","Type":"ContainerStarted","Data":"84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529"} Dec 06 09:27:21 crc kubenswrapper[4672]: I1206 09:27:21.489895 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:21 crc kubenswrapper[4672]: I1206 09:27:21.521421 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" podStartSLOduration=3.521395903 podStartE2EDuration="3.521395903s" podCreationTimestamp="2025-12-06 09:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:27:21.518443683 +0000 UTC m=+1259.262703990" watchObservedRunningTime="2025-12-06 09:27:21.521395903 +0000 UTC m=+1259.265656220" Dec 06 09:27:28 crc kubenswrapper[4672]: I1206 09:27:28.966805 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.031515 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c565c95f-vfw8s"] Dec 06 09:27:29 crc 
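The startup-latency line above is plain arithmetic: podStartSLOduration is watchObservedRunningTime (09:27:21.521395903) minus podCreationTimestamp (09:27:18), i.e. 3.521395903s, and the zero-valued firstStartedPulling/lastFinishedPulling show that no image-pull time was subtracted because the images were already present. The same computation, reproduced from the logged timestamps:

```go
package main

import (
	"fmt"
	"time"
)

// Reproduce the tracker's arithmetic from the two timestamps in the log line
// (the monotonic "m=+…" suffixes are dropped; wall-clock values suffice here).
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-12-06 09:27:18 +0000 UTC")
	running, _ := time.Parse(layout, "2025-12-06 09:27:21.521395903 +0000 UTC")
	fmt.Println(running.Sub(created)) // 3.521395903s, matching podStartSLOduration
}
```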
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.031904 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerName="dnsmasq-dns" containerID="cri-o://8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a" gracePeriod=10
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.198830 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f644dcf9-td5cr"]
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.200370 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.236336 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-nb\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.236562 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-config\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.236591 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-openstack-edpm-ipam\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.236663 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2cln\" (UniqueName: \"kubernetes.io/projected/3021c245-1d0d-4727-906b-26784c5e10bc-kube-api-access-w2cln\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.236767 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-sb\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.236798 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-dns-svc\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.263085 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f644dcf9-td5cr"]
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.345276 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-config\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.345363 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-openstack-edpm-ipam\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.345416 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2cln\" (UniqueName: \"kubernetes.io/projected/3021c245-1d0d-4727-906b-26784c5e10bc-kube-api-access-w2cln\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.345460 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-sb\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.345485 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-dns-svc\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.345555 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-nb\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.346460 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-nb\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.346802 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-dns-svc\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.347190 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-openstack-edpm-ipam\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.347853 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-sb\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.349858 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-config\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.364265 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2cln\" (UniqueName: \"kubernetes.io/projected/3021c245-1d0d-4727-906b-26784c5e10bc-kube-api-access-w2cln\") pod \"dnsmasq-dns-54f644dcf9-td5cr\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") " pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.527732 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.532188 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.581389 4672 generic.go:334] "Generic (PLEG): container finished" podID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerID="8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a" exitCode=0
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.581464 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.581484 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" event={"ID":"0abba4d9-6af7-4aaf-894e-b442873ec67f","Type":"ContainerDied","Data":"8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a"}
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.581799 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c565c95f-vfw8s" event={"ID":"0abba4d9-6af7-4aaf-894e-b442873ec67f","Type":"ContainerDied","Data":"8b2f14d2dfbbcb8e6c7496b8f876f2efc7afa3a9d1d42b23801e64e719275e07"}
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.581854 4672 scope.go:117] "RemoveContainer" containerID="8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.622873 4672 scope.go:117] "RemoveContainer" containerID="9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759"
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.649483 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-dns-svc\") pod \"0abba4d9-6af7-4aaf-894e-b442873ec67f\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") "
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.649562 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-config\") pod \"0abba4d9-6af7-4aaf-894e-b442873ec67f\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") "
Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.649679 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjv2w\" (UniqueName: \"kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w\") pod \"0abba4d9-6af7-4aaf-894e-b442873ec67f\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") "
\"kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w\") pod \"0abba4d9-6af7-4aaf-894e-b442873ec67f\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.650343 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-nb\") pod \"0abba4d9-6af7-4aaf-894e-b442873ec67f\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.650727 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-sb\") pod \"0abba4d9-6af7-4aaf-894e-b442873ec67f\" (UID: \"0abba4d9-6af7-4aaf-894e-b442873ec67f\") " Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.668224 4672 scope.go:117] "RemoveContainer" containerID="8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.669795 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w" (OuterVolumeSpecName: "kube-api-access-kjv2w") pod "0abba4d9-6af7-4aaf-894e-b442873ec67f" (UID: "0abba4d9-6af7-4aaf-894e-b442873ec67f"). InnerVolumeSpecName "kube-api-access-kjv2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:29 crc kubenswrapper[4672]: E1206 09:27:29.669965 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a\": container with ID starting with 8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a not found: ID does not exist" containerID="8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.670005 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a"} err="failed to get container status \"8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a\": rpc error: code = NotFound desc = could not find container \"8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a\": container with ID starting with 8eac14ca1d8e5b54f6841d14cb10323cd282a8e9aa7674babd4fd918681bf96a not found: ID does not exist" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.670056 4672 scope.go:117] "RemoveContainer" containerID="9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759" Dec 06 09:27:29 crc kubenswrapper[4672]: E1206 09:27:29.670338 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759\": container with ID starting with 9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759 not found: ID does not exist" containerID="9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.670368 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759"} err="failed to get container status 
\"9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759\": rpc error: code = NotFound desc = could not find container \"9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759\": container with ID starting with 9649a3eb21fd71ac961f736dcd52f86cdbafb2e60481952a632e744886613759 not found: ID does not exist" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.703016 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0abba4d9-6af7-4aaf-894e-b442873ec67f" (UID: "0abba4d9-6af7-4aaf-894e-b442873ec67f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.705898 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-config" (OuterVolumeSpecName: "config") pod "0abba4d9-6af7-4aaf-894e-b442873ec67f" (UID: "0abba4d9-6af7-4aaf-894e-b442873ec67f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.712109 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0abba4d9-6af7-4aaf-894e-b442873ec67f" (UID: "0abba4d9-6af7-4aaf-894e-b442873ec67f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.742091 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0abba4d9-6af7-4aaf-894e-b442873ec67f" (UID: "0abba4d9-6af7-4aaf-894e-b442873ec67f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.752804 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.752831 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.752846 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.752856 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abba4d9-6af7-4aaf-894e-b442873ec67f-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.752866 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjv2w\" (UniqueName: \"kubernetes.io/projected/0abba4d9-6af7-4aaf-894e-b442873ec67f-kube-api-access-kjv2w\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.918484 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c565c95f-vfw8s"] Dec 06 09:27:29 crc kubenswrapper[4672]: I1206 09:27:29.925113 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8c565c95f-vfw8s"] Dec 06 09:27:30 crc kubenswrapper[4672]: I1206 09:27:30.036762 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f644dcf9-td5cr"] Dec 06 09:27:30 crc kubenswrapper[4672]: W1206 09:27:30.041061 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3021c245_1d0d_4727_906b_26784c5e10bc.slice/crio-209674e34a76b9fbeb4ab75a065667e3e443fe16557599425edf8a27bca87755 WatchSource:0}: Error finding container 209674e34a76b9fbeb4ab75a065667e3e443fe16557599425edf8a27bca87755: Status 404 returned error can't find the container with id 209674e34a76b9fbeb4ab75a065667e3e443fe16557599425edf8a27bca87755 Dec 06 09:27:30 crc kubenswrapper[4672]: I1206 09:27:30.567959 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" path="/var/lib/kubelet/pods/0abba4d9-6af7-4aaf-894e-b442873ec67f/volumes" Dec 06 09:27:30 crc kubenswrapper[4672]: I1206 09:27:30.589871 4672 generic.go:334] "Generic (PLEG): container finished" podID="3021c245-1d0d-4727-906b-26784c5e10bc" containerID="aac2eb8cb92e521fb47aef0baa2e09a0281592ee44653b6364f4856fe669fa0c" exitCode=0 Dec 06 09:27:30 crc kubenswrapper[4672]: I1206 09:27:30.589925 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" event={"ID":"3021c245-1d0d-4727-906b-26784c5e10bc","Type":"ContainerDied","Data":"aac2eb8cb92e521fb47aef0baa2e09a0281592ee44653b6364f4856fe669fa0c"} Dec 06 09:27:30 crc kubenswrapper[4672]: I1206 09:27:30.589974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" event={"ID":"3021c245-1d0d-4727-906b-26784c5e10bc","Type":"ContainerStarted","Data":"209674e34a76b9fbeb4ab75a065667e3e443fe16557599425edf8a27bca87755"} Dec 06 
09:27:31 crc kubenswrapper[4672]: I1206 09:27:31.606848 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" event={"ID":"3021c245-1d0d-4727-906b-26784c5e10bc","Type":"ContainerStarted","Data":"085952fa0187515779031389f1e54ce2ad21ae44257525de5dc2037206b4aff4"} Dec 06 09:27:31 crc kubenswrapper[4672]: I1206 09:27:31.607425 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" Dec 06 09:27:31 crc kubenswrapper[4672]: I1206 09:27:31.653207 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" podStartSLOduration=2.653170528 podStartE2EDuration="2.653170528s" podCreationTimestamp="2025-12-06 09:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:27:31.641476239 +0000 UTC m=+1269.385736536" watchObservedRunningTime="2025-12-06 09:27:31.653170528 +0000 UTC m=+1269.397430855" Dec 06 09:27:39 crc kubenswrapper[4672]: I1206 09:27:39.529475 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" Dec 06 09:27:39 crc kubenswrapper[4672]: I1206 09:27:39.628144 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ff95b8d7-sgwmd"] Dec 06 09:27:39 crc kubenswrapper[4672]: I1206 09:27:39.628414 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerName="dnsmasq-dns" containerID="cri-o://84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529" gracePeriod=10 Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.101385 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.194820 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-config\") pod \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.194990 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-openstack-edpm-ipam\") pod \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.195049 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-sb\") pod \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.195082 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-dns-svc\") pod \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.195105 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-nb\") pod \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.195124 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltfsq\" (UniqueName: \"kubernetes.io/projected/0ea4505d-157e-4390-bcc0-4dd53c9ee787-kube-api-access-ltfsq\") pod \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\" (UID: \"0ea4505d-157e-4390-bcc0-4dd53c9ee787\") " Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.202567 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea4505d-157e-4390-bcc0-4dd53c9ee787-kube-api-access-ltfsq" (OuterVolumeSpecName: "kube-api-access-ltfsq") pod "0ea4505d-157e-4390-bcc0-4dd53c9ee787" (UID: "0ea4505d-157e-4390-bcc0-4dd53c9ee787"). InnerVolumeSpecName "kube-api-access-ltfsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.239774 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ea4505d-157e-4390-bcc0-4dd53c9ee787" (UID: "0ea4505d-157e-4390-bcc0-4dd53c9ee787"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.252475 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-config" (OuterVolumeSpecName: "config") pod "0ea4505d-157e-4390-bcc0-4dd53c9ee787" (UID: "0ea4505d-157e-4390-bcc0-4dd53c9ee787"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.256222 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0ea4505d-157e-4390-bcc0-4dd53c9ee787" (UID: "0ea4505d-157e-4390-bcc0-4dd53c9ee787"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.256940 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ea4505d-157e-4390-bcc0-4dd53c9ee787" (UID: "0ea4505d-157e-4390-bcc0-4dd53c9ee787"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.291256 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ea4505d-157e-4390-bcc0-4dd53c9ee787" (UID: "0ea4505d-157e-4390-bcc0-4dd53c9ee787"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.298160 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.298202 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.298213 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.298222 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.298232 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltfsq\" (UniqueName: \"kubernetes.io/projected/0ea4505d-157e-4390-bcc0-4dd53c9ee787-kube-api-access-ltfsq\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.298243 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea4505d-157e-4390-bcc0-4dd53c9ee787-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.712422 4672 generic.go:334] "Generic (PLEG): container finished" podID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerID="84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529" exitCode=0 Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.712505 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.712510 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" event={"ID":"0ea4505d-157e-4390-bcc0-4dd53c9ee787","Type":"ContainerDied","Data":"84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529"} Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.713938 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ff95b8d7-sgwmd" event={"ID":"0ea4505d-157e-4390-bcc0-4dd53c9ee787","Type":"ContainerDied","Data":"c3f7eff42592fa57a2bf5f012d7bd3b3312eac9f308e21bad6ff80183409feab"} Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.713974 4672 scope.go:117] "RemoveContainer" containerID="84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.751311 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ff95b8d7-sgwmd"] Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.775814 4672 scope.go:117] "RemoveContainer" containerID="135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.779463 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75ff95b8d7-sgwmd"] Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.855099 4672 scope.go:117] "RemoveContainer" containerID="84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529" Dec 06 09:27:40 crc kubenswrapper[4672]: E1206 09:27:40.855580 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529\": container with ID starting with 84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529 not found: ID does not exist" containerID="84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.855694 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529"} err="failed to get container status \"84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529\": rpc error: code = NotFound desc = could not find container \"84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529\": container with ID starting with 84dc300346c71694a737d84651e75edf0ed4247979e1363d0cc019b4f3566529 not found: ID does not exist" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.855795 4672 scope.go:117] "RemoveContainer" containerID="135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84" Dec 06 09:27:40 crc kubenswrapper[4672]: E1206 09:27:40.856288 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84\": container with ID starting with 135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84 not found: ID does not exist" containerID="135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84" Dec 06 09:27:40 crc kubenswrapper[4672]: I1206 09:27:40.856378 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84"} err="failed to get container status 
\"135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84\": rpc error: code = NotFound desc = could not find container \"135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84\": container with ID starting with 135fd471e734f409e57782c3a99b3a7306a4dce9b9c435ed94c442ee1fa6fe84 not found: ID does not exist" Dec 06 09:27:42 crc kubenswrapper[4672]: I1206 09:27:42.577382 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" path="/var/lib/kubelet/pods/0ea4505d-157e-4390-bcc0-4dd53c9ee787/volumes" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.386662 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959"] Dec 06 09:27:45 crc kubenswrapper[4672]: E1206 09:27:45.387575 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerName="init" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.387596 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerName="init" Dec 06 09:27:45 crc kubenswrapper[4672]: E1206 09:27:45.387648 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerName="init" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.387662 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerName="init" Dec 06 09:27:45 crc kubenswrapper[4672]: E1206 09:27:45.387715 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerName="dnsmasq-dns" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.387731 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerName="dnsmasq-dns" Dec 06 09:27:45 crc kubenswrapper[4672]: E1206 09:27:45.387762 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerName="dnsmasq-dns" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.387775 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerName="dnsmasq-dns" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.388089 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea4505d-157e-4390-bcc0-4dd53c9ee787" containerName="dnsmasq-dns" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.388122 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0abba4d9-6af7-4aaf-894e-b442873ec67f" containerName="dnsmasq-dns" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.389272 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.393995 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.394201 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.398394 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.398861 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.407705 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959"] Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.498938 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.498994 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.499115 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghwh5\" (UniqueName: \"kubernetes.io/projected/750227a2-c497-4579-b34b-3ebb2a8d502b-kube-api-access-ghwh5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.499144 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.601628 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.601710 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.602209 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghwh5\" (UniqueName: \"kubernetes.io/projected/750227a2-c497-4579-b34b-3ebb2a8d502b-kube-api-access-ghwh5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.602260 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.608755 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.612517 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.613645 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.619484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghwh5\" (UniqueName: \"kubernetes.io/projected/750227a2-c497-4579-b34b-3ebb2a8d502b-kube-api-access-ghwh5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zj959\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:45 crc kubenswrapper[4672]: I1206 09:27:45.730734 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:27:46 crc kubenswrapper[4672]: I1206 09:27:46.539209 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959"] Dec 06 09:27:46 crc kubenswrapper[4672]: I1206 09:27:46.788635 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" event={"ID":"750227a2-c497-4579-b34b-3ebb2a8d502b","Type":"ContainerStarted","Data":"3edcca62449d2e627c1b8c56deac770399353550ef6c376a4b0130c8a6ee13ec"} Dec 06 09:27:49 crc kubenswrapper[4672]: I1206 09:27:49.811999 4672 generic.go:334] "Generic (PLEG): container finished" podID="f3cf9f22-30ac-48ca-9d05-407868710c73" containerID="a58e59f3087910789f4c09f4391850be06c566c3464160e453533a09ae57925a" exitCode=0 Dec 06 09:27:49 crc kubenswrapper[4672]: I1206 09:27:49.812062 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3cf9f22-30ac-48ca-9d05-407868710c73","Type":"ContainerDied","Data":"a58e59f3087910789f4c09f4391850be06c566c3464160e453533a09ae57925a"} Dec 06 09:27:50 crc kubenswrapper[4672]: I1206 09:27:50.831305 4672 generic.go:334] "Generic (PLEG): container finished" podID="3f71615b-1205-44b2-b4aa-c03548716486" containerID="55692ea67f7a9e733685901b276040278ab90788c3cce45b3bb3b638eb5b77a1" exitCode=0 Dec 06 09:27:50 crc kubenswrapper[4672]: I1206 09:27:50.831353 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f71615b-1205-44b2-b4aa-c03548716486","Type":"ContainerDied","Data":"55692ea67f7a9e733685901b276040278ab90788c3cce45b3bb3b638eb5b77a1"} Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.879356 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3cf9f22-30ac-48ca-9d05-407868710c73","Type":"ContainerStarted","Data":"0eb826578882011f4e775dfbf7549b369bfea3b14bcf101099991f4fbbe2e126"} Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.881030 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.883508 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" event={"ID":"750227a2-c497-4579-b34b-3ebb2a8d502b","Type":"ContainerStarted","Data":"c8cc845e4d957fc76f47277fe9c4869e9a8d6dc9d4741f4f3b222aa757d9b3fb"} Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.886776 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f71615b-1205-44b2-b4aa-c03548716486","Type":"ContainerStarted","Data":"c73f9950a31ee2958743806360a2d50e0cc76c96bcd4080b79c9d1c2459b0ec3"} Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.887011 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.948055 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.948040038 podStartE2EDuration="40.948040038s" podCreationTimestamp="2025-12-06 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:27:55.940077941 +0000 UTC m=+1293.684338228" watchObservedRunningTime="2025-12-06 
09:27:55.948040038 +0000 UTC m=+1293.692300325" Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.948663 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.948659386 podStartE2EDuration="41.948659386s" podCreationTimestamp="2025-12-06 09:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:27:55.915860741 +0000 UTC m=+1293.660121048" watchObservedRunningTime="2025-12-06 09:27:55.948659386 +0000 UTC m=+1293.692919673" Dec 06 09:27:55 crc kubenswrapper[4672]: I1206 09:27:55.973496 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" podStartSLOduration=2.0542781469999998 podStartE2EDuration="10.973483652s" podCreationTimestamp="2025-12-06 09:27:45 +0000 UTC" firstStartedPulling="2025-12-06 09:27:46.547118057 +0000 UTC m=+1284.291378344" lastFinishedPulling="2025-12-06 09:27:55.466323552 +0000 UTC m=+1293.210583849" observedRunningTime="2025-12-06 09:27:55.968098235 +0000 UTC m=+1293.712358542" watchObservedRunningTime="2025-12-06 09:27:55.973483652 +0000 UTC m=+1293.717743939" Dec 06 09:28:05 crc kubenswrapper[4672]: I1206 09:28:05.107063 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 09:28:06 crc kubenswrapper[4672]: I1206 09:28:06.133833 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 09:28:06 crc kubenswrapper[4672]: I1206 09:28:06.979725 4672 generic.go:334] "Generic (PLEG): container finished" podID="750227a2-c497-4579-b34b-3ebb2a8d502b" containerID="c8cc845e4d957fc76f47277fe9c4869e9a8d6dc9d4741f4f3b222aa757d9b3fb" exitCode=0 Dec 06 09:28:06 crc kubenswrapper[4672]: I1206 09:28:06.980042 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" event={"ID":"750227a2-c497-4579-b34b-3ebb2a8d502b","Type":"ContainerDied","Data":"c8cc845e4d957fc76f47277fe9c4869e9a8d6dc9d4741f4f3b222aa757d9b3fb"} Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.482031 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.655186 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-inventory\") pod \"750227a2-c497-4579-b34b-3ebb2a8d502b\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.655653 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghwh5\" (UniqueName: \"kubernetes.io/projected/750227a2-c497-4579-b34b-3ebb2a8d502b-kube-api-access-ghwh5\") pod \"750227a2-c497-4579-b34b-3ebb2a8d502b\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.655807 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-ssh-key\") pod \"750227a2-c497-4579-b34b-3ebb2a8d502b\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.656008 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-repo-setup-combined-ca-bundle\") pod \"750227a2-c497-4579-b34b-3ebb2a8d502b\" (UID: \"750227a2-c497-4579-b34b-3ebb2a8d502b\") " Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.660675 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750227a2-c497-4579-b34b-3ebb2a8d502b-kube-api-access-ghwh5" (OuterVolumeSpecName: "kube-api-access-ghwh5") pod "750227a2-c497-4579-b34b-3ebb2a8d502b" (UID: "750227a2-c497-4579-b34b-3ebb2a8d502b"). InnerVolumeSpecName "kube-api-access-ghwh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.667160 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "750227a2-c497-4579-b34b-3ebb2a8d502b" (UID: "750227a2-c497-4579-b34b-3ebb2a8d502b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.690459 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-inventory" (OuterVolumeSpecName: "inventory") pod "750227a2-c497-4579-b34b-3ebb2a8d502b" (UID: "750227a2-c497-4579-b34b-3ebb2a8d502b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.711976 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "750227a2-c497-4579-b34b-3ebb2a8d502b" (UID: "750227a2-c497-4579-b34b-3ebb2a8d502b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.757974 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.758185 4672 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.758259 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750227a2-c497-4579-b34b-3ebb2a8d502b-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:08 crc kubenswrapper[4672]: I1206 09:28:08.758326 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghwh5\" (UniqueName: \"kubernetes.io/projected/750227a2-c497-4579-b34b-3ebb2a8d502b-kube-api-access-ghwh5\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.000332 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" event={"ID":"750227a2-c497-4579-b34b-3ebb2a8d502b","Type":"ContainerDied","Data":"3edcca62449d2e627c1b8c56deac770399353550ef6c376a4b0130c8a6ee13ec"} Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.000384 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edcca62449d2e627c1b8c56deac770399353550ef6c376a4b0130c8a6ee13ec" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.000804 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.129820 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j"] Dec 06 09:28:09 crc kubenswrapper[4672]: E1206 09:28:09.130283 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750227a2-c497-4579-b34b-3ebb2a8d502b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.130307 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="750227a2-c497-4579-b34b-3ebb2a8d502b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.130591 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="750227a2-c497-4579-b34b-3ebb2a8d502b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.131342 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.133895 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.134726 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.136711 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.137273 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.144535 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j"] Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.267943 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.268011 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnfk\" (UniqueName: \"kubernetes.io/projected/a5e522ba-e183-41c7-a1f3-b9085bdac873-kube-api-access-smnfk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.268083 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.268120 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.370240 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.370297 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.370419 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.370458 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnfk\" (UniqueName: \"kubernetes.io/projected/a5e522ba-e183-41c7-a1f3-b9085bdac873-kube-api-access-smnfk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.375583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.376306 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.377106 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.393356 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnfk\" (UniqueName: \"kubernetes.io/projected/a5e522ba-e183-41c7-a1f3-b9085bdac873-kube-api-access-smnfk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.449474 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:28:09 crc kubenswrapper[4672]: I1206 09:28:09.987200 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j"] Dec 06 09:28:10 crc kubenswrapper[4672]: I1206 09:28:10.009038 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" event={"ID":"a5e522ba-e183-41c7-a1f3-b9085bdac873","Type":"ContainerStarted","Data":"96e1a2bc82a5f363bc1231e4079660bdadf61b1cf777f95a7d2321de36dfaf92"} Dec 06 09:28:11 crc kubenswrapper[4672]: I1206 09:28:11.019214 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" event={"ID":"a5e522ba-e183-41c7-a1f3-b9085bdac873","Type":"ContainerStarted","Data":"b480dbc4b124c07682885b7255ee6783a8a2f6b4f4e06f57ea4ce59e0d31fb98"} Dec 06 09:28:42 crc kubenswrapper[4672]: I1206 09:28:42.320153 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:28:42 crc kubenswrapper[4672]: I1206 09:28:42.320841 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:29:12 crc kubenswrapper[4672]: I1206 09:29:12.319548 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:29:12 crc kubenswrapper[4672]: I1206 09:29:12.320257 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:29:24 crc kubenswrapper[4672]: I1206 09:29:24.801274 4672 scope.go:117] "RemoveContainer" containerID="309afffbabec4d43c69870d78c6a5d2f44dc65cf1d57c3157ccc80462025e840" Dec 06 09:29:42 crc kubenswrapper[4672]: I1206 09:29:42.319552 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:29:42 crc kubenswrapper[4672]: I1206 09:29:42.321151 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:29:42 crc kubenswrapper[4672]: I1206 09:29:42.321260 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:29:42 crc kubenswrapper[4672]: I1206 09:29:42.322069 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"928ca0b7127fa7d124fdf57161f7517b407b723d73a8c32f6adf1f0ea4548786"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:29:42 crc kubenswrapper[4672]: I1206 09:29:42.322207 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://928ca0b7127fa7d124fdf57161f7517b407b723d73a8c32f6adf1f0ea4548786" gracePeriod=600 Dec 06 09:29:43 crc kubenswrapper[4672]: I1206 09:29:43.033562 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="928ca0b7127fa7d124fdf57161f7517b407b723d73a8c32f6adf1f0ea4548786" exitCode=0 Dec 06 09:29:43 crc kubenswrapper[4672]: I1206 09:29:43.033622 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"928ca0b7127fa7d124fdf57161f7517b407b723d73a8c32f6adf1f0ea4548786"} Dec 06 09:29:43 crc kubenswrapper[4672]: I1206 09:29:43.033987 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"} Dec 06 09:29:43 crc kubenswrapper[4672]: I1206 09:29:43.034016 4672 scope.go:117] "RemoveContainer" containerID="a101a6d3a9ea73e6619b9412aec8733c3ef377249e41f4c656d15ff2d987965d" Dec 06 09:29:43 crc kubenswrapper[4672]: I1206 09:29:43.058439 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" podStartSLOduration=93.461984881 podStartE2EDuration="1m34.058422741s" podCreationTimestamp="2025-12-06 09:28:09 +0000 UTC" firstStartedPulling="2025-12-06 09:28:09.992953008 +0000 UTC m=+1307.737213305" lastFinishedPulling="2025-12-06 09:28:10.589390858 +0000 UTC m=+1308.333651165" observedRunningTime="2025-12-06 09:28:11.040380354 +0000 UTC m=+1308.784640641" watchObservedRunningTime="2025-12-06 09:29:43.058422741 +0000 UTC m=+1400.802683028" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.170036 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q"] Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.171962 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.175878 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.178139 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.222290 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q"] Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.350044 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd5xf\" (UniqueName: \"kubernetes.io/projected/2804b63a-981b-41bd-bedb-370f4d1a4820-kube-api-access-fd5xf\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.350102 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2804b63a-981b-41bd-bedb-370f4d1a4820-config-volume\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.350128 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2804b63a-981b-41bd-bedb-370f4d1a4820-secret-volume\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.451818 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd5xf\" (UniqueName: \"kubernetes.io/projected/2804b63a-981b-41bd-bedb-370f4d1a4820-kube-api-access-fd5xf\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.451908 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2804b63a-981b-41bd-bedb-370f4d1a4820-config-volume\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.452906 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2804b63a-981b-41bd-bedb-370f4d1a4820-config-volume\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.451947 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2804b63a-981b-41bd-bedb-370f4d1a4820-secret-volume\") pod 
\"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.460627 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2804b63a-981b-41bd-bedb-370f4d1a4820-secret-volume\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.474775 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd5xf\" (UniqueName: \"kubernetes.io/projected/2804b63a-981b-41bd-bedb-370f4d1a4820-kube-api-access-fd5xf\") pod \"collect-profiles-29416890-2hz5q\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.523103 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:00 crc kubenswrapper[4672]: I1206 09:30:00.980308 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q"] Dec 06 09:30:01 crc kubenswrapper[4672]: I1206 09:30:01.232849 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" event={"ID":"2804b63a-981b-41bd-bedb-370f4d1a4820","Type":"ContainerStarted","Data":"5f8d1e342b2907d2cfc6d9974f8e7517f45ea6edd8ef201e2b8cdb4d95c8b7bb"} Dec 06 09:30:01 crc kubenswrapper[4672]: I1206 09:30:01.233168 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" event={"ID":"2804b63a-981b-41bd-bedb-370f4d1a4820","Type":"ContainerStarted","Data":"ce9409708fe46f06d56eed2771f6ff224dfb52e6dba6d0c7daacbd1c47fb77f5"} Dec 06 09:30:01 crc kubenswrapper[4672]: I1206 09:30:01.252210 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" podStartSLOduration=1.252190664 podStartE2EDuration="1.252190664s" podCreationTimestamp="2025-12-06 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:30:01.251474654 +0000 UTC m=+1418.995734981" watchObservedRunningTime="2025-12-06 09:30:01.252190664 +0000 UTC m=+1418.996450951" Dec 06 09:30:02 crc kubenswrapper[4672]: I1206 09:30:02.246527 4672 generic.go:334] "Generic (PLEG): container finished" podID="2804b63a-981b-41bd-bedb-370f4d1a4820" containerID="5f8d1e342b2907d2cfc6d9974f8e7517f45ea6edd8ef201e2b8cdb4d95c8b7bb" exitCode=0 Dec 06 09:30:02 crc kubenswrapper[4672]: I1206 09:30:02.247005 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" event={"ID":"2804b63a-981b-41bd-bedb-370f4d1a4820","Type":"ContainerDied","Data":"5f8d1e342b2907d2cfc6d9974f8e7517f45ea6edd8ef201e2b8cdb4d95c8b7bb"} Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.601780 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.707835 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2804b63a-981b-41bd-bedb-370f4d1a4820-secret-volume\") pod \"2804b63a-981b-41bd-bedb-370f4d1a4820\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.708053 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2804b63a-981b-41bd-bedb-370f4d1a4820-config-volume\") pod \"2804b63a-981b-41bd-bedb-370f4d1a4820\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.708139 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd5xf\" (UniqueName: \"kubernetes.io/projected/2804b63a-981b-41bd-bedb-370f4d1a4820-kube-api-access-fd5xf\") pod \"2804b63a-981b-41bd-bedb-370f4d1a4820\" (UID: \"2804b63a-981b-41bd-bedb-370f4d1a4820\") " Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.708737 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2804b63a-981b-41bd-bedb-370f4d1a4820-config-volume" (OuterVolumeSpecName: "config-volume") pod "2804b63a-981b-41bd-bedb-370f4d1a4820" (UID: "2804b63a-981b-41bd-bedb-370f4d1a4820"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.709750 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2804b63a-981b-41bd-bedb-370f4d1a4820-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.715576 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2804b63a-981b-41bd-bedb-370f4d1a4820-kube-api-access-fd5xf" (OuterVolumeSpecName: "kube-api-access-fd5xf") pod "2804b63a-981b-41bd-bedb-370f4d1a4820" (UID: "2804b63a-981b-41bd-bedb-370f4d1a4820"). InnerVolumeSpecName "kube-api-access-fd5xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.715867 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2804b63a-981b-41bd-bedb-370f4d1a4820-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2804b63a-981b-41bd-bedb-370f4d1a4820" (UID: "2804b63a-981b-41bd-bedb-370f4d1a4820"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.811139 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd5xf\" (UniqueName: \"kubernetes.io/projected/2804b63a-981b-41bd-bedb-370f4d1a4820-kube-api-access-fd5xf\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:03 crc kubenswrapper[4672]: I1206 09:30:03.811175 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2804b63a-981b-41bd-bedb-370f4d1a4820-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:04 crc kubenswrapper[4672]: I1206 09:30:04.269402 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" event={"ID":"2804b63a-981b-41bd-bedb-370f4d1a4820","Type":"ContainerDied","Data":"ce9409708fe46f06d56eed2771f6ff224dfb52e6dba6d0c7daacbd1c47fb77f5"} Dec 06 09:30:04 crc kubenswrapper[4672]: I1206 09:30:04.269464 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce9409708fe46f06d56eed2771f6ff224dfb52e6dba6d0c7daacbd1c47fb77f5" Dec 06 09:30:04 crc kubenswrapper[4672]: I1206 09:30:04.269782 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q" Dec 06 09:30:24 crc kubenswrapper[4672]: I1206 09:30:24.873749 4672 scope.go:117] "RemoveContainer" containerID="2f424b0498a0ac11b6c44b1a0c740dfcb7daa1e794b1450978a3bb7757bf7b89" Dec 06 09:30:24 crc kubenswrapper[4672]: I1206 09:30:24.906278 4672 scope.go:117] "RemoveContainer" containerID="08985045aa95156649c283873a6329b5a506bb275fa4b543c4ba84e1df919191" Dec 06 09:30:25 crc kubenswrapper[4672]: I1206 09:30:25.002182 4672 scope.go:117] "RemoveContainer" containerID="fcfbb3a8fa34c8ded377c8feb108edb04a6c456309a3c94277e9b7012da91f57" Dec 06 09:31:42 crc kubenswrapper[4672]: I1206 09:31:42.321728 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:31:42 crc kubenswrapper[4672]: I1206 09:31:42.322641 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:31:55 crc kubenswrapper[4672]: I1206 09:31:55.385570 4672 generic.go:334] "Generic (PLEG): container finished" podID="a5e522ba-e183-41c7-a1f3-b9085bdac873" containerID="b480dbc4b124c07682885b7255ee6783a8a2f6b4f4e06f57ea4ce59e0d31fb98" exitCode=0 Dec 06 09:31:55 crc kubenswrapper[4672]: I1206 09:31:55.385664 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" event={"ID":"a5e522ba-e183-41c7-a1f3-b9085bdac873","Type":"ContainerDied","Data":"b480dbc4b124c07682885b7255ee6783a8a2f6b4f4e06f57ea4ce59e0d31fb98"} Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.803909 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.886591 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-inventory\") pod \"a5e522ba-e183-41c7-a1f3-b9085bdac873\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.886751 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-ssh-key\") pod \"a5e522ba-e183-41c7-a1f3-b9085bdac873\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.886776 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smnfk\" (UniqueName: \"kubernetes.io/projected/a5e522ba-e183-41c7-a1f3-b9085bdac873-kube-api-access-smnfk\") pod \"a5e522ba-e183-41c7-a1f3-b9085bdac873\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.886898 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-bootstrap-combined-ca-bundle\") pod \"a5e522ba-e183-41c7-a1f3-b9085bdac873\" (UID: \"a5e522ba-e183-41c7-a1f3-b9085bdac873\") " Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.893753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a5e522ba-e183-41c7-a1f3-b9085bdac873" (UID: "a5e522ba-e183-41c7-a1f3-b9085bdac873"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.894869 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e522ba-e183-41c7-a1f3-b9085bdac873-kube-api-access-smnfk" (OuterVolumeSpecName: "kube-api-access-smnfk") pod "a5e522ba-e183-41c7-a1f3-b9085bdac873" (UID: "a5e522ba-e183-41c7-a1f3-b9085bdac873"). InnerVolumeSpecName "kube-api-access-smnfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.925210 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5e522ba-e183-41c7-a1f3-b9085bdac873" (UID: "a5e522ba-e183-41c7-a1f3-b9085bdac873"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.938112 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-inventory" (OuterVolumeSpecName: "inventory") pod "a5e522ba-e183-41c7-a1f3-b9085bdac873" (UID: "a5e522ba-e183-41c7-a1f3-b9085bdac873"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.989453 4672 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.989498 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.989515 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5e522ba-e183-41c7-a1f3-b9085bdac873-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:56 crc kubenswrapper[4672]: I1206 09:31:56.989532 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smnfk\" (UniqueName: \"kubernetes.io/projected/a5e522ba-e183-41c7-a1f3-b9085bdac873-kube-api-access-smnfk\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.409048 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" event={"ID":"a5e522ba-e183-41c7-a1f3-b9085bdac873","Type":"ContainerDied","Data":"96e1a2bc82a5f363bc1231e4079660bdadf61b1cf777f95a7d2321de36dfaf92"} Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.409505 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e1a2bc82a5f363bc1231e4079660bdadf61b1cf777f95a7d2321de36dfaf92" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.409149 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.527785 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp"] Dec 06 09:31:57 crc kubenswrapper[4672]: E1206 09:31:57.528302 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2804b63a-981b-41bd-bedb-370f4d1a4820" containerName="collect-profiles" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.528331 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2804b63a-981b-41bd-bedb-370f4d1a4820" containerName="collect-profiles" Dec 06 09:31:57 crc kubenswrapper[4672]: E1206 09:31:57.528358 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e522ba-e183-41c7-a1f3-b9085bdac873" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.528374 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e522ba-e183-41c7-a1f3-b9085bdac873" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.528736 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2804b63a-981b-41bd-bedb-370f4d1a4820" containerName="collect-profiles" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.528766 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e522ba-e183-41c7-a1f3-b9085bdac873" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.529683 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.531895 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.533251 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.533878 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.536206 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.542025 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp"] Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.600576 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.600741 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.600771 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzgwz\" (UniqueName: \"kubernetes.io/projected/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-kube-api-access-bzgwz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.703581 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.703761 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.703811 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzgwz\" (UniqueName: \"kubernetes.io/projected/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-kube-api-access-bzgwz\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.709343 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.710540 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.722899 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzgwz\" (UniqueName: \"kubernetes.io/projected/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-kube-api-access-bzgwz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:57 crc kubenswrapper[4672]: I1206 09:31:57.860114 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:31:58 crc kubenswrapper[4672]: I1206 09:31:58.476589 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp"] Dec 06 09:31:58 crc kubenswrapper[4672]: I1206 09:31:58.489280 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:31:59 crc kubenswrapper[4672]: I1206 09:31:59.453167 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" event={"ID":"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c","Type":"ContainerStarted","Data":"381734fb76e133db0b2856795d81d52f3b7ca56692768146bbbbe5241b5d0b1b"} Dec 06 09:31:59 crc kubenswrapper[4672]: I1206 09:31:59.453755 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" event={"ID":"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c","Type":"ContainerStarted","Data":"dc351cc8b44b4861cef7502e8a321fe89fae0660ade041cef4f8a4776d5a428a"} Dec 06 09:31:59 crc kubenswrapper[4672]: I1206 09:31:59.480363 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" podStartSLOduration=1.99464193 podStartE2EDuration="2.480341754s" podCreationTimestamp="2025-12-06 09:31:57 +0000 UTC" firstStartedPulling="2025-12-06 09:31:58.489013429 +0000 UTC m=+1536.233273716" lastFinishedPulling="2025-12-06 09:31:58.974713213 +0000 UTC m=+1536.718973540" observedRunningTime="2025-12-06 09:31:59.469213873 +0000 UTC m=+1537.213474170" watchObservedRunningTime="2025-12-06 09:31:59.480341754 +0000 UTC m=+1537.224602041" Dec 06 09:32:12 crc kubenswrapper[4672]: I1206 09:32:12.320003 4672 patch_prober.go:28] interesting 
Dec 06 09:32:12 crc kubenswrapper[4672]: I1206 09:32:12.320763 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:32:25 crc kubenswrapper[4672]: I1206 09:32:25.108162 4672 scope.go:117] "RemoveContainer" containerID="e07590524f183820b0a8a6e77c268ad94429c6cd54e7127b1081a709225ce187"
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.319365 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.320026 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.320081 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh"
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.320841 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.320926 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" gracePeriod=600
Dec 06 09:32:42 crc kubenswrapper[4672]: E1206 09:32:42.460851 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.961365 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" exitCode=0
Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.961413 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"}
event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"} Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.961746 4672 scope.go:117] "RemoveContainer" containerID="928ca0b7127fa7d124fdf57161f7517b407b723d73a8c32f6adf1f0ea4548786" Dec 06 09:32:42 crc kubenswrapper[4672]: I1206 09:32:42.962349 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:32:42 crc kubenswrapper[4672]: E1206 09:32:42.962640 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.071650 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b3bc-account-create-update-wf22b"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.092095 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vnjc2"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.103528 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b3bc-account-create-update-wf22b"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.112991 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6jp7t"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.120388 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jp69p"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.126934 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f703-account-create-update-zxv8m"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.132892 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6jp7t"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.139062 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f703-account-create-update-zxv8m"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.145432 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jp69p"] Dec 06 09:32:43 crc kubenswrapper[4672]: I1206 09:32:43.151567 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vnjc2"] Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.040555 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ce36-account-create-update-66fz7"] Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.051078 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ce36-account-create-update-66fz7"] Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.567240 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d9a4c5-61b5-4342-8b9f-853fefc25329" path="/var/lib/kubelet/pods/21d9a4c5-61b5-4342-8b9f-853fefc25329/volumes" Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.568418 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438c535f-88ae-4e3d-98f9-014d67606706" path="/var/lib/kubelet/pods/438c535f-88ae-4e3d-98f9-014d67606706/volumes" Dec 06 09:32:44 crc 
Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.569714 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9" path="/var/lib/kubelet/pods/4eecb7d4-6ed6-4f94-9a6c-dc8abfd473a9/volumes"
Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.570841 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ecba74-762d-4ca3-a6c4-99c9804d5d64" path="/var/lib/kubelet/pods/d4ecba74-762d-4ca3-a6c4-99c9804d5d64/volumes"
Dec 06 09:32:44 crc kubenswrapper[4672]: I1206 09:32:44.571458 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffeed4c1-b7f1-4475-850b-f768a7aafe8a" path="/var/lib/kubelet/pods/ffeed4c1-b7f1-4475-850b-f768a7aafe8a/volumes"
Dec 06 09:32:56 crc kubenswrapper[4672]: I1206 09:32:56.557010 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:32:56 crc kubenswrapper[4672]: E1206 09:32:56.557984 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.078858 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tlfw9"]
Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.099587 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tlfw9"]
Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.427377 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t6nrl"]
Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.429468 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6nrl"
Need to start a new one" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.457407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6nrl"] Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.589317 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzpc\" (UniqueName: \"kubernetes.io/projected/89ffe66e-c24d-477c-919c-06eaa04f910e-kube-api-access-2kzpc\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.589386 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-utilities\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.589418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-catalog-content\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.691091 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzpc\" (UniqueName: \"kubernetes.io/projected/89ffe66e-c24d-477c-919c-06eaa04f910e-kube-api-access-2kzpc\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.691153 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-utilities\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.691184 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-catalog-content\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.691583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-catalog-content\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.691682 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-utilities\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.718281 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2kzpc\" (UniqueName: \"kubernetes.io/projected/89ffe66e-c24d-477c-919c-06eaa04f910e-kube-api-access-2kzpc\") pod \"community-operators-t6nrl\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:09 crc kubenswrapper[4672]: I1206 09:33:09.756305 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:10 crc kubenswrapper[4672]: I1206 09:33:10.280871 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6nrl"] Dec 06 09:33:10 crc kubenswrapper[4672]: I1206 09:33:10.557231 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:33:10 crc kubenswrapper[4672]: E1206 09:33:10.557763 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:33:10 crc kubenswrapper[4672]: I1206 09:33:10.568442 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d834bae-e9bc-4c2e-be87-2749743f8ef0" path="/var/lib/kubelet/pods/1d834bae-e9bc-4c2e-be87-2749743f8ef0/volumes" Dec 06 09:33:11 crc kubenswrapper[4672]: I1206 09:33:11.272136 4672 generic.go:334] "Generic (PLEG): container finished" podID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerID="abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9" exitCode=0 Dec 06 09:33:11 crc kubenswrapper[4672]: I1206 09:33:11.272184 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerDied","Data":"abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9"} Dec 06 09:33:11 crc kubenswrapper[4672]: I1206 09:33:11.272212 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerStarted","Data":"704cf2e5919eceddab1049cc63a23d8df44ac3bc690b0995d586db2bac214cab"} Dec 06 09:33:12 crc kubenswrapper[4672]: I1206 09:33:12.294001 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerStarted","Data":"53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da"} Dec 06 09:33:13 crc kubenswrapper[4672]: I1206 09:33:13.320196 4672 generic.go:334] "Generic (PLEG): container finished" podID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerID="53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da" exitCode=0 Dec 06 09:33:13 crc kubenswrapper[4672]: I1206 09:33:13.320251 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerDied","Data":"53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da"} Dec 06 09:33:14 crc kubenswrapper[4672]: I1206 09:33:14.336792 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerStarted","Data":"d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd"} Dec 06 09:33:14 crc kubenswrapper[4672]: I1206 09:33:14.374881 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t6nrl" podStartSLOduration=2.633005282 podStartE2EDuration="5.374858284s" podCreationTimestamp="2025-12-06 09:33:09 +0000 UTC" firstStartedPulling="2025-12-06 09:33:11.273665972 +0000 UTC m=+1609.017926259" lastFinishedPulling="2025-12-06 09:33:14.015518934 +0000 UTC m=+1611.759779261" observedRunningTime="2025-12-06 09:33:14.364951567 +0000 UTC m=+1612.109211864" watchObservedRunningTime="2025-12-06 09:33:14.374858284 +0000 UTC m=+1612.119118571" Dec 06 09:33:19 crc kubenswrapper[4672]: I1206 09:33:19.756767 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:19 crc kubenswrapper[4672]: I1206 09:33:19.758811 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:19 crc kubenswrapper[4672]: I1206 09:33:19.853683 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:20 crc kubenswrapper[4672]: I1206 09:33:20.479000 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:20 crc kubenswrapper[4672]: I1206 09:33:20.553170 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6nrl"] Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.430493 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t6nrl" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="registry-server" containerID="cri-o://d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd" gracePeriod=2 Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.867889 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.948417 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-catalog-content\") pod \"89ffe66e-c24d-477c-919c-06eaa04f910e\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.948683 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kzpc\" (UniqueName: \"kubernetes.io/projected/89ffe66e-c24d-477c-919c-06eaa04f910e-kube-api-access-2kzpc\") pod \"89ffe66e-c24d-477c-919c-06eaa04f910e\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.948793 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-utilities\") pod \"89ffe66e-c24d-477c-919c-06eaa04f910e\" (UID: \"89ffe66e-c24d-477c-919c-06eaa04f910e\") " Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.949927 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-utilities" (OuterVolumeSpecName: "utilities") pod "89ffe66e-c24d-477c-919c-06eaa04f910e" (UID: "89ffe66e-c24d-477c-919c-06eaa04f910e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:33:22 crc kubenswrapper[4672]: I1206 09:33:22.957035 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ffe66e-c24d-477c-919c-06eaa04f910e-kube-api-access-2kzpc" (OuterVolumeSpecName: "kube-api-access-2kzpc") pod "89ffe66e-c24d-477c-919c-06eaa04f910e" (UID: "89ffe66e-c24d-477c-919c-06eaa04f910e"). InnerVolumeSpecName "kube-api-access-2kzpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.019448 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89ffe66e-c24d-477c-919c-06eaa04f910e" (UID: "89ffe66e-c24d-477c-919c-06eaa04f910e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.050518 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.050567 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ffe66e-c24d-477c-919c-06eaa04f910e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.050584 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kzpc\" (UniqueName: \"kubernetes.io/projected/89ffe66e-c24d-477c-919c-06eaa04f910e-kube-api-access-2kzpc\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.449777 4672 generic.go:334] "Generic (PLEG): container finished" podID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerID="d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd" exitCode=0 Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.449853 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerDied","Data":"d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd"} Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.449924 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6nrl" event={"ID":"89ffe66e-c24d-477c-919c-06eaa04f910e","Type":"ContainerDied","Data":"704cf2e5919eceddab1049cc63a23d8df44ac3bc690b0995d586db2bac214cab"} Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.449943 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t6nrl" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.449955 4672 scope.go:117] "RemoveContainer" containerID="d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.520761 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6nrl"] Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.521375 4672 scope.go:117] "RemoveContainer" containerID="53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.537646 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t6nrl"] Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.549919 4672 scope.go:117] "RemoveContainer" containerID="abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.557260 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:33:23 crc kubenswrapper[4672]: E1206 09:33:23.557632 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.594837 4672 scope.go:117] "RemoveContainer" containerID="d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd" Dec 06 09:33:23 crc kubenswrapper[4672]: E1206 09:33:23.599215 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd\": container with ID starting with d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd not found: ID does not exist" containerID="d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.599358 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd"} err="failed to get container status \"d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd\": rpc error: code = NotFound desc = could not find container \"d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd\": container with ID starting with d7a32265ccacf385525b4d0232f4d148ad7059d55f5bb7308448eef1f3affbcd not found: ID does not exist" Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.599450 4672 scope.go:117] "RemoveContainer" containerID="53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da" Dec 06 09:33:23 crc kubenswrapper[4672]: E1206 09:33:23.607059 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da\": container with ID starting with 53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da not found: ID does not exist" containerID="53f3e3acc8e91029376daac301ba454e96c60c99e1067513e5b0ad6a68f4b3da" Dec 06 09:33:23 crc 
Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.607134 4672 scope.go:117] "RemoveContainer" containerID="abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9"
Dec 06 09:33:23 crc kubenswrapper[4672]: E1206 09:33:23.607440 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9\": container with ID starting with abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9 not found: ID does not exist" containerID="abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9"
Dec 06 09:33:23 crc kubenswrapper[4672]: I1206 09:33:23.607459 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9"} err="failed to get container status \"abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9\": rpc error: code = NotFound desc = could not find container \"abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9\": container with ID starting with abc055cc358c5684b757cd4350c03184fbfa7b85a0e4329e86eda733450e3ce9 not found: ID does not exist"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.064078 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-9mbg7"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.108781 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c4c2-account-create-update-6lfzq"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.123102 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54df-account-create-update-cq6z9"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.131100 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0de7-account-create-update-zxjmn"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.138514 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c4c2-account-create-update-6lfzq"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.145862 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-9mbg7"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.152928 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-54df-account-create-update-cq6z9"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.160037 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0de7-account-create-update-zxjmn"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.168377 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-pp5qd"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.175433 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-pp5qd"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.184559 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cbj5x"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.191522 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cbj5x"]
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.569461 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165da568-dddb-472f-860a-0f36faa35334" path="/var/lib/kubelet/pods/165da568-dddb-472f-860a-0f36faa35334/volumes"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.570286 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31cf51cc-3132-467e-8e83-a8633775caa9" path="/var/lib/kubelet/pods/31cf51cc-3132-467e-8e83-a8633775caa9/volumes"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.571064 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498d72d7-0324-4ab2-9208-0b32e1df2efa" path="/var/lib/kubelet/pods/498d72d7-0324-4ab2-9208-0b32e1df2efa/volumes"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.571840 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eaceb20-8947-487b-b126-28b9509598ef" path="/var/lib/kubelet/pods/4eaceb20-8947-487b-b126-28b9509598ef/volumes"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.573141 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" path="/var/lib/kubelet/pods/89ffe66e-c24d-477c-919c-06eaa04f910e/volumes"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.574067 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="babff9b3-3ae2-49df-9b15-c4c7110c21f6" path="/var/lib/kubelet/pods/babff9b3-3ae2-49df-9b15-c4c7110c21f6/volumes"
Dec 06 09:33:24 crc kubenswrapper[4672]: I1206 09:33:24.574801 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02" path="/var/lib/kubelet/pods/ebca8fb8-c9a7-4ce0-ba17-0d03ef90de02/volumes"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.165170 4672 scope.go:117] "RemoveContainer" containerID="805a53042030daa2c1cbac1e0e89a3b90c7e98a10747d1a136497b415ddccb40"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.200790 4672 scope.go:117] "RemoveContainer" containerID="f2049860aae71f4e068fa600205d6bc7f390cef78952f53e563e8510a67184c2"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.247591 4672 scope.go:117] "RemoveContainer" containerID="03fbec216a02d03977f726a9d1a0ec30b683bf17c08165795eeedd3707415d9f"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.298329 4672 scope.go:117] "RemoveContainer" containerID="09408829ab4f8d61406eb319b486ffe5219bbaca59cc509747aa7eaa5cd79ca3"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.324813 4672 scope.go:117] "RemoveContainer" containerID="b76c56d4d0ca7315c8a755fa2c649eaa1e616d7550c7c65fdb869c70285c12a7"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.393839 4672 scope.go:117] "RemoveContainer" containerID="ee65c5fd5ed129c3f69bccfa941a91c09f6bcf684773da7bffea705b0be720e1"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.415473 4672 scope.go:117] "RemoveContainer" containerID="ca10994a9fd98630e1d44279d028b49b0db78557ffea7c52668ca9ba336d5558"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.440925 4672 scope.go:117] "RemoveContainer" containerID="d8b784aa4ff03fd54abb29abbb3979fe86931fe2c7fd1fbc2f6d783b470fe074"
Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.461519 4672 scope.go:117] "RemoveContainer" containerID="fddd92464dd7a0e099dab5aa9f85a16b03b77d4a2cdbcee0fb70b97d2606c96e"
containerID="fddd92464dd7a0e099dab5aa9f85a16b03b77d4a2cdbcee0fb70b97d2606c96e" Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.491248 4672 scope.go:117] "RemoveContainer" containerID="dc597b7112339745322ed2b80445088fdd35a47d4a64341aecdc32474035ea27" Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.510028 4672 scope.go:117] "RemoveContainer" containerID="38cfbdc5b578be4bc8957b97d561849aece809e2f3d4ade4cf597f1f7de51e61" Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.529492 4672 scope.go:117] "RemoveContainer" containerID="f65e4895b4e77ee8d5d836970adfc6c1ce905d5b3cfb8c43aae2719db8718e27" Dec 06 09:33:25 crc kubenswrapper[4672]: I1206 09:33:25.553550 4672 scope.go:117] "RemoveContainer" containerID="01be840edd74c703d608f7e92900632621eb912485a14429b820f8b313b66b7d" Dec 06 09:33:27 crc kubenswrapper[4672]: I1206 09:33:27.514754 4672 generic.go:334] "Generic (PLEG): container finished" podID="c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" containerID="381734fb76e133db0b2856795d81d52f3b7ca56692768146bbbbe5241b5d0b1b" exitCode=0 Dec 06 09:33:27 crc kubenswrapper[4672]: I1206 09:33:27.514885 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" event={"ID":"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c","Type":"ContainerDied","Data":"381734fb76e133db0b2856795d81d52f3b7ca56692768146bbbbe5241b5d0b1b"} Dec 06 09:33:28 crc kubenswrapper[4672]: I1206 09:33:28.041661 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jq2gm"] Dec 06 09:33:28 crc kubenswrapper[4672]: I1206 09:33:28.047761 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jq2gm"] Dec 06 09:33:28 crc kubenswrapper[4672]: I1206 09:33:28.566874 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d00f72-7218-4bbf-bbfd-cd664d5be035" path="/var/lib/kubelet/pods/d9d00f72-7218-4bbf-bbfd-cd664d5be035/volumes" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:28.996466 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.076926 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-ssh-key\") pod \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.077126 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzgwz\" (UniqueName: \"kubernetes.io/projected/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-kube-api-access-bzgwz\") pod \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.077180 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-inventory\") pod \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\" (UID: \"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c\") " Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.085898 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-kube-api-access-bzgwz" (OuterVolumeSpecName: "kube-api-access-bzgwz") pod "c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" (UID: "c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c"). InnerVolumeSpecName "kube-api-access-bzgwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.123343 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-inventory" (OuterVolumeSpecName: "inventory") pod "c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" (UID: "c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.124567 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" (UID: "c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.184887 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.184946 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzgwz\" (UniqueName: \"kubernetes.io/projected/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-kube-api-access-bzgwz\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.184976 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.537868 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" event={"ID":"c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c","Type":"ContainerDied","Data":"dc351cc8b44b4861cef7502e8a321fe89fae0660ade041cef4f8a4776d5a428a"} Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.537906 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc351cc8b44b4861cef7502e8a321fe89fae0660ade041cef4f8a4776d5a428a" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.537955 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.617969 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d"] Dec 06 09:33:29 crc kubenswrapper[4672]: E1206 09:33:29.618311 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="extract-content" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.618324 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="extract-content" Dec 06 09:33:29 crc kubenswrapper[4672]: E1206 09:33:29.618340 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="extract-utilities" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.618346 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="extract-utilities" Dec 06 09:33:29 crc kubenswrapper[4672]: E1206 09:33:29.618361 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.618369 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:33:29 crc kubenswrapper[4672]: E1206 09:33:29.618387 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="registry-server" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.618393 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="registry-server" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.618553 4672 
memory_manager.go:354] "RemoveStaleState removing state" podUID="89ffe66e-c24d-477c-919c-06eaa04f910e" containerName="registry-server" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.618571 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.619129 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.622113 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.622635 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.624577 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.628018 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.631142 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d"] Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.694012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.694518 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.694624 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2mls\" (UniqueName: \"kubernetes.io/projected/33362cf5-2204-478e-b155-8277d00131a6-kube-api-access-v2mls\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.796862 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.797231 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2mls\" (UniqueName: 
\"kubernetes.io/projected/33362cf5-2204-478e-b155-8277d00131a6-kube-api-access-v2mls\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.797411 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.802494 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.805067 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.830362 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2mls\" (UniqueName: \"kubernetes.io/projected/33362cf5-2204-478e-b155-8277d00131a6-kube-api-access-v2mls\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:29 crc kubenswrapper[4672]: I1206 09:33:29.949367 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:30 crc kubenswrapper[4672]: I1206 09:33:30.549640 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d"] Dec 06 09:33:31 crc kubenswrapper[4672]: I1206 09:33:31.552560 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" event={"ID":"33362cf5-2204-478e-b155-8277d00131a6","Type":"ContainerStarted","Data":"157c2c0e2445db78091d7c3c15af2dfba630a570d2cfb1ae3ea139febbaa451e"} Dec 06 09:33:31 crc kubenswrapper[4672]: I1206 09:33:31.552842 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" event={"ID":"33362cf5-2204-478e-b155-8277d00131a6","Type":"ContainerStarted","Data":"70dbbe3ebabc57be41f347bba7b8a0d1eac3d36a5194cd833e6507f593dd646e"} Dec 06 09:33:31 crc kubenswrapper[4672]: I1206 09:33:31.580817 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" podStartSLOduration=2.130848959 podStartE2EDuration="2.580802243s" podCreationTimestamp="2025-12-06 09:33:29 +0000 UTC" firstStartedPulling="2025-12-06 09:33:30.542999645 +0000 UTC m=+1628.287259932" lastFinishedPulling="2025-12-06 09:33:30.992952889 +0000 UTC m=+1628.737213216" observedRunningTime="2025-12-06 09:33:31.572639072 +0000 UTC m=+1629.316899359" watchObservedRunningTime="2025-12-06 09:33:31.580802243 +0000 UTC m=+1629.325062530" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.102210 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fx94x"] Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.106000 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.112787 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fx94x"] Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.178346 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-utilities\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.179149 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-catalog-content\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.179227 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lnqz\" (UniqueName: \"kubernetes.io/projected/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-kube-api-access-8lnqz\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.280687 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-utilities\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.280828 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-catalog-content\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.280852 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lnqz\" (UniqueName: \"kubernetes.io/projected/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-kube-api-access-8lnqz\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.281226 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-utilities\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.281306 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-catalog-content\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.298922 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8lnqz\" (UniqueName: \"kubernetes.io/projected/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-kube-api-access-8lnqz\") pod \"redhat-operators-fx94x\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.422761 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:34 crc kubenswrapper[4672]: I1206 09:33:34.978449 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fx94x"] Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.293735 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-khg2x"] Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.296017 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.312875 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khg2x"] Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.413054 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz7z8\" (UniqueName: \"kubernetes.io/projected/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-kube-api-access-fz7z8\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.413113 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-utilities\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.413154 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-catalog-content\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.514972 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7z8\" (UniqueName: \"kubernetes.io/projected/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-kube-api-access-fz7z8\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.515023 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-utilities\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.515041 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-catalog-content\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " 
pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.515504 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-catalog-content\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.515572 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-utilities\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.534425 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7z8\" (UniqueName: \"kubernetes.io/projected/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-kube-api-access-fz7z8\") pod \"certified-operators-khg2x\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") " pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.601675 4672 generic.go:334] "Generic (PLEG): container finished" podID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerID="dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3" exitCode=0 Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.601731 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerDied","Data":"dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3"} Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.601768 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerStarted","Data":"46e260f7c8df23947d372e8b41ce411ea412c1b601598eb2ce2567daadba86bc"} Dec 06 09:33:35 crc kubenswrapper[4672]: I1206 09:33:35.612641 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:36 crc kubenswrapper[4672]: I1206 09:33:36.062104 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khg2x"] Dec 06 09:33:36 crc kubenswrapper[4672]: I1206 09:33:36.557259 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:33:36 crc kubenswrapper[4672]: E1206 09:33:36.557822 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:33:36 crc kubenswrapper[4672]: I1206 09:33:36.611416 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerStarted","Data":"9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4"} Dec 06 09:33:36 crc kubenswrapper[4672]: I1206 09:33:36.613482 4672 generic.go:334] "Generic (PLEG): container finished" podID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerID="c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376" exitCode=0 Dec 06 09:33:36 crc kubenswrapper[4672]: I1206 09:33:36.613516 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerDied","Data":"c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376"} Dec 06 09:33:36 crc kubenswrapper[4672]: I1206 09:33:36.613537 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerStarted","Data":"9b98792e0b95ab614235c10866eebe782bcab6f194466a4e2d8f8c4095133adc"} Dec 06 09:33:37 crc kubenswrapper[4672]: I1206 09:33:37.623393 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerStarted","Data":"63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6"} Dec 06 09:33:38 crc kubenswrapper[4672]: I1206 09:33:38.631020 4672 generic.go:334] "Generic (PLEG): container finished" podID="33362cf5-2204-478e-b155-8277d00131a6" containerID="157c2c0e2445db78091d7c3c15af2dfba630a570d2cfb1ae3ea139febbaa451e" exitCode=0 Dec 06 09:33:38 crc kubenswrapper[4672]: I1206 09:33:38.631138 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" event={"ID":"33362cf5-2204-478e-b155-8277d00131a6","Type":"ContainerDied","Data":"157c2c0e2445db78091d7c3c15af2dfba630a570d2cfb1ae3ea139febbaa451e"} Dec 06 09:33:39 crc kubenswrapper[4672]: I1206 09:33:39.641481 4672 generic.go:334] "Generic (PLEG): container finished" podID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerID="63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6" exitCode=0 Dec 06 09:33:39 crc kubenswrapper[4672]: I1206 09:33:39.641562 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" 
event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerDied","Data":"63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6"} Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.015653 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.108304 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2mls\" (UniqueName: \"kubernetes.io/projected/33362cf5-2204-478e-b155-8277d00131a6-kube-api-access-v2mls\") pod \"33362cf5-2204-478e-b155-8277d00131a6\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.108352 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-inventory\") pod \"33362cf5-2204-478e-b155-8277d00131a6\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.108392 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-ssh-key\") pod \"33362cf5-2204-478e-b155-8277d00131a6\" (UID: \"33362cf5-2204-478e-b155-8277d00131a6\") " Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.136358 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33362cf5-2204-478e-b155-8277d00131a6-kube-api-access-v2mls" (OuterVolumeSpecName: "kube-api-access-v2mls") pod "33362cf5-2204-478e-b155-8277d00131a6" (UID: "33362cf5-2204-478e-b155-8277d00131a6"). InnerVolumeSpecName "kube-api-access-v2mls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.181256 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33362cf5-2204-478e-b155-8277d00131a6" (UID: "33362cf5-2204-478e-b155-8277d00131a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.199091 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-inventory" (OuterVolumeSpecName: "inventory") pod "33362cf5-2204-478e-b155-8277d00131a6" (UID: "33362cf5-2204-478e-b155-8277d00131a6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.213746 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2mls\" (UniqueName: \"kubernetes.io/projected/33362cf5-2204-478e-b155-8277d00131a6-kube-api-access-v2mls\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.213794 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.213806 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33362cf5-2204-478e-b155-8277d00131a6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.651892 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerStarted","Data":"6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e"} Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.654807 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" event={"ID":"33362cf5-2204-478e-b155-8277d00131a6","Type":"ContainerDied","Data":"70dbbe3ebabc57be41f347bba7b8a0d1eac3d36a5194cd833e6507f593dd646e"} Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.654814 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.654849 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70dbbe3ebabc57be41f347bba7b8a0d1eac3d36a5194cd833e6507f593dd646e" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.657818 4672 generic.go:334] "Generic (PLEG): container finished" podID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerID="9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4" exitCode=0 Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.657847 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerDied","Data":"9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4"} Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.680455 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-khg2x" podStartSLOduration=2.257616794 podStartE2EDuration="5.68043852s" podCreationTimestamp="2025-12-06 09:33:35 +0000 UTC" firstStartedPulling="2025-12-06 09:33:36.615467987 +0000 UTC m=+1634.359728274" lastFinishedPulling="2025-12-06 09:33:40.038289713 +0000 UTC m=+1637.782550000" observedRunningTime="2025-12-06 09:33:40.679944677 +0000 UTC m=+1638.424204964" watchObservedRunningTime="2025-12-06 09:33:40.68043852 +0000 UTC m=+1638.424698807" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.739427 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww"] Dec 06 09:33:40 crc kubenswrapper[4672]: E1206 09:33:40.739850 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33362cf5-2204-478e-b155-8277d00131a6" 
containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.739868 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="33362cf5-2204-478e-b155-8277d00131a6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.740049 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="33362cf5-2204-478e-b155-8277d00131a6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.740633 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.744177 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.744341 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.744450 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.753317 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww"] Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.756841 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.826649 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.826816 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.827034 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhfvr\" (UniqueName: \"kubernetes.io/projected/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-kube-api-access-rhfvr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.929029 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.929101 4672 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.929189 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhfvr\" (UniqueName: \"kubernetes.io/projected/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-kube-api-access-rhfvr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.934975 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.934972 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:40 crc kubenswrapper[4672]: I1206 09:33:40.954503 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhfvr\" (UniqueName: \"kubernetes.io/projected/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-kube-api-access-rhfvr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kwnww\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:41 crc kubenswrapper[4672]: I1206 09:33:41.061068 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:33:41 crc kubenswrapper[4672]: I1206 09:33:41.422444 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww"] Dec 06 09:33:41 crc kubenswrapper[4672]: W1206 09:33:41.424281 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod202ca8e6_0bd4_4b3f_b90e_6feb22bdea2e.slice/crio-437b05133aae101e75eebb3c2b0e6d10c1342c8ba42f07cc7759bdb7aae7554e WatchSource:0}: Error finding container 437b05133aae101e75eebb3c2b0e6d10c1342c8ba42f07cc7759bdb7aae7554e: Status 404 returned error can't find the container with id 437b05133aae101e75eebb3c2b0e6d10c1342c8ba42f07cc7759bdb7aae7554e Dec 06 09:33:41 crc kubenswrapper[4672]: I1206 09:33:41.670759 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerStarted","Data":"7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a"} Dec 06 09:33:41 crc kubenswrapper[4672]: I1206 09:33:41.671811 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" event={"ID":"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e","Type":"ContainerStarted","Data":"437b05133aae101e75eebb3c2b0e6d10c1342c8ba42f07cc7759bdb7aae7554e"} Dec 06 09:33:41 crc kubenswrapper[4672]: I1206 09:33:41.688181 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fx94x" podStartSLOduration=2.230079285 podStartE2EDuration="7.688161367s" podCreationTimestamp="2025-12-06 09:33:34 +0000 UTC" firstStartedPulling="2025-12-06 09:33:35.604034301 +0000 UTC m=+1633.348294578" lastFinishedPulling="2025-12-06 09:33:41.062116363 +0000 UTC m=+1638.806376660" observedRunningTime="2025-12-06 09:33:41.685752261 +0000 UTC m=+1639.430012548" watchObservedRunningTime="2025-12-06 09:33:41.688161367 +0000 UTC m=+1639.432421674" Dec 06 09:33:42 crc kubenswrapper[4672]: I1206 09:33:42.692718 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" event={"ID":"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e","Type":"ContainerStarted","Data":"2ee253348ca477296a8c93bd7223507e4bf865837529803da55c069a07c171f6"} Dec 06 09:33:42 crc kubenswrapper[4672]: I1206 09:33:42.712139 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" podStartSLOduration=1.991985658 podStartE2EDuration="2.712123s" podCreationTimestamp="2025-12-06 09:33:40 +0000 UTC" firstStartedPulling="2025-12-06 09:33:41.426380106 +0000 UTC m=+1639.170640393" lastFinishedPulling="2025-12-06 09:33:42.146517448 +0000 UTC m=+1639.890777735" observedRunningTime="2025-12-06 09:33:42.708338648 +0000 UTC m=+1640.452598935" watchObservedRunningTime="2025-12-06 09:33:42.712123 +0000 UTC m=+1640.456383287" Dec 06 09:33:44 crc kubenswrapper[4672]: I1206 09:33:44.423451 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:44 crc kubenswrapper[4672]: I1206 09:33:44.423811 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:45 crc kubenswrapper[4672]: I1206 09:33:45.470947 4672 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fx94x" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="registry-server" probeResult="failure" output=< Dec 06 09:33:45 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:33:45 crc kubenswrapper[4672]: > Dec 06 09:33:45 crc kubenswrapper[4672]: I1206 09:33:45.613530 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:45 crc kubenswrapper[4672]: I1206 09:33:45.613589 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:46 crc kubenswrapper[4672]: I1206 09:33:46.652533 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-khg2x" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="registry-server" probeResult="failure" output=< Dec 06 09:33:46 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:33:46 crc kubenswrapper[4672]: > Dec 06 09:33:49 crc kubenswrapper[4672]: I1206 09:33:49.557507 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:33:49 crc kubenswrapper[4672]: E1206 09:33:49.557982 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:33:52 crc kubenswrapper[4672]: I1206 09:33:52.046968 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4dcrc"] Dec 06 09:33:52 crc kubenswrapper[4672]: I1206 09:33:52.054733 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4dcrc"] Dec 06 09:33:52 crc kubenswrapper[4672]: I1206 09:33:52.566039 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e31a55-73d0-43a5-8308-40e18ab22f58" path="/var/lib/kubelet/pods/e5e31a55-73d0-43a5-8308-40e18ab22f58/volumes" Dec 06 09:33:54 crc kubenswrapper[4672]: I1206 09:33:54.476916 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:54 crc kubenswrapper[4672]: I1206 09:33:54.526542 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:54 crc kubenswrapper[4672]: I1206 09:33:54.721868 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fx94x"] Dec 06 09:33:55 crc kubenswrapper[4672]: I1206 09:33:55.671736 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:55 crc kubenswrapper[4672]: I1206 09:33:55.718683 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-khg2x" Dec 06 09:33:55 crc kubenswrapper[4672]: I1206 09:33:55.833247 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fx94x" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" 
containerName="registry-server" containerID="cri-o://7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a" gracePeriod=2 Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.337171 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.387901 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lnqz\" (UniqueName: \"kubernetes.io/projected/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-kube-api-access-8lnqz\") pod \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.388014 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-catalog-content\") pod \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.388039 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-utilities\") pod \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\" (UID: \"60193924-70a5-4baa-9d3f-1f39f3a3c0dc\") " Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.388842 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-utilities" (OuterVolumeSpecName: "utilities") pod "60193924-70a5-4baa-9d3f-1f39f3a3c0dc" (UID: "60193924-70a5-4baa-9d3f-1f39f3a3c0dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.404521 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-kube-api-access-8lnqz" (OuterVolumeSpecName: "kube-api-access-8lnqz") pod "60193924-70a5-4baa-9d3f-1f39f3a3c0dc" (UID: "60193924-70a5-4baa-9d3f-1f39f3a3c0dc"). InnerVolumeSpecName "kube-api-access-8lnqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.489174 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.489220 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lnqz\" (UniqueName: \"kubernetes.io/projected/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-kube-api-access-8lnqz\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.525154 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60193924-70a5-4baa-9d3f-1f39f3a3c0dc" (UID: "60193924-70a5-4baa-9d3f-1f39f3a3c0dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.590579 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60193924-70a5-4baa-9d3f-1f39f3a3c0dc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.841979 4672 generic.go:334] "Generic (PLEG): container finished" podID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerID="7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a" exitCode=0 Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.842035 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx94x" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.842046 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerDied","Data":"7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a"} Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.842948 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx94x" event={"ID":"60193924-70a5-4baa-9d3f-1f39f3a3c0dc","Type":"ContainerDied","Data":"46e260f7c8df23947d372e8b41ce411ea412c1b601598eb2ce2567daadba86bc"} Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.842969 4672 scope.go:117] "RemoveContainer" containerID="7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.863124 4672 scope.go:117] "RemoveContainer" containerID="9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.879683 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fx94x"] Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.884813 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fx94x"] Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.889354 4672 scope.go:117] "RemoveContainer" containerID="dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.927881 4672 scope.go:117] "RemoveContainer" containerID="7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a" Dec 06 09:33:56 crc kubenswrapper[4672]: E1206 09:33:56.928320 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a\": container with ID starting with 7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a not found: ID does not exist" containerID="7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.928353 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a"} err="failed to get container status \"7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a\": rpc error: code = NotFound desc = could not find container \"7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a\": container with ID starting with 7b91b1860fe75aa1236795ddbcd8a173b3fe35ece8036271d070422d662e629a not found: ID does not exist" Dec 06 09:33:56 crc 
kubenswrapper[4672]: I1206 09:33:56.928374 4672 scope.go:117] "RemoveContainer" containerID="9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4" Dec 06 09:33:56 crc kubenswrapper[4672]: E1206 09:33:56.928771 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4\": container with ID starting with 9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4 not found: ID does not exist" containerID="9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.928841 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4"} err="failed to get container status \"9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4\": rpc error: code = NotFound desc = could not find container \"9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4\": container with ID starting with 9fa35473dc86195f085f4e6368f47f6500f5059d9d9a35c002fdb5f06f72a7c4 not found: ID does not exist" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.928871 4672 scope.go:117] "RemoveContainer" containerID="dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3" Dec 06 09:33:56 crc kubenswrapper[4672]: E1206 09:33:56.929173 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3\": container with ID starting with dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3 not found: ID does not exist" containerID="dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3" Dec 06 09:33:56 crc kubenswrapper[4672]: I1206 09:33:56.929195 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3"} err="failed to get container status \"dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3\": rpc error: code = NotFound desc = could not find container \"dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3\": container with ID starting with dc3406ba605861c66cd305c2990b81d43a952743d251e253a35cf872b59f2ea3 not found: ID does not exist" Dec 06 09:33:57 crc kubenswrapper[4672]: I1206 09:33:57.729404 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khg2x"] Dec 06 09:33:57 crc kubenswrapper[4672]: I1206 09:33:57.729902 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-khg2x" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="registry-server" containerID="cri-o://6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e" gracePeriod=2 Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.246932 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khg2x"
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.423854 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz7z8\" (UniqueName: \"kubernetes.io/projected/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-kube-api-access-fz7z8\") pod \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") "
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.423959 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-catalog-content\") pod \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") "
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.424152 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-utilities\") pod \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\" (UID: \"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7\") "
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.424823 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-utilities" (OuterVolumeSpecName: "utilities") pod "d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" (UID: "d89d3fb0-8fa5-46b0-a1e3-353459d7cff7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.436073 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-kube-api-access-fz7z8" (OuterVolumeSpecName: "kube-api-access-fz7z8") pod "d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" (UID: "d89d3fb0-8fa5-46b0-a1e3-353459d7cff7"). InnerVolumeSpecName "kube-api-access-fz7z8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.482479 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" (UID: "d89d3fb0-8fa5-46b0-a1e3-353459d7cff7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.526226 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz7z8\" (UniqueName: \"kubernetes.io/projected/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-kube-api-access-fz7z8\") on node \"crc\" DevicePath \"\""
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.526276 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.526294 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.568746 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" path="/var/lib/kubelet/pods/60193924-70a5-4baa-9d3f-1f39f3a3c0dc/volumes"
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.867322 4672 generic.go:334] "Generic (PLEG): container finished" podID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerID="6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e" exitCode=0
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.867370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerDied","Data":"6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e"}
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.867420 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khg2x" event={"ID":"d89d3fb0-8fa5-46b0-a1e3-353459d7cff7","Type":"ContainerDied","Data":"9b98792e0b95ab614235c10866eebe782bcab6f194466a4e2d8f8c4095133adc"}
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.867450 4672 scope.go:117] "RemoveContainer" containerID="6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e"
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.867516 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khg2x"
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.895941 4672 scope.go:117] "RemoveContainer" containerID="63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6"
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.902754 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khg2x"]
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.934091 4672 scope.go:117] "RemoveContainer" containerID="c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376"
Dec 06 09:33:58 crc kubenswrapper[4672]: I1206 09:33:58.955860 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-khg2x"]
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.038889 4672 scope.go:117] "RemoveContainer" containerID="6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e"
Dec 06 09:33:59 crc kubenswrapper[4672]: E1206 09:33:59.040850 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e\": container with ID starting with 6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e not found: ID does not exist" containerID="6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e"
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.040894 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e"} err="failed to get container status \"6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e\": rpc error: code = NotFound desc = could not find container \"6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e\": container with ID starting with 6f924e0587aa5f0accad0a2013632b96114778bb14182162aa394ee59702495e not found: ID does not exist"
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.040924 4672 scope.go:117] "RemoveContainer" containerID="63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6"
Dec 06 09:33:59 crc kubenswrapper[4672]: E1206 09:33:59.041209 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6\": container with ID starting with 63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6 not found: ID does not exist" containerID="63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6"
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.041241 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6"} err="failed to get container status \"63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6\": rpc error: code = NotFound desc = could not find container \"63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6\": container with ID starting with 63af343756aedb934a4ea2bb7ca7eccfd083be2b2b1fd325ec5a3a6800cd02d6 not found: ID does not exist"
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.041261 4672 scope.go:117] "RemoveContainer" containerID="c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376"
Dec 06 09:33:59 crc kubenswrapper[4672]: E1206 09:33:59.041648 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376\": container with ID starting with c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376 not found: ID does not exist" containerID="c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376"
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.041679 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376"} err="failed to get container status \"c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376\": rpc error: code = NotFound desc = could not find container \"c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376\": container with ID starting with c827d1fe1b7f85c36d86edd454d3324c4dfe2d86dbb71eefceedf05634be4376 not found: ID does not exist"
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.049519 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wz7rb"]
Dec 06 09:33:59 crc kubenswrapper[4672]: I1206 09:33:59.062875 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wz7rb"]
Dec 06 09:34:00 crc kubenswrapper[4672]: I1206 09:34:00.039106 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pmmnm"]
Dec 06 09:34:00 crc kubenswrapper[4672]: I1206 09:34:00.052841 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pmmnm"]
Dec 06 09:34:00 crc kubenswrapper[4672]: I1206 09:34:00.568366 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f78f7c5-5965-4932-845c-8f0be90b421a" path="/var/lib/kubelet/pods/4f78f7c5-5965-4932-845c-8f0be90b421a/volumes"
Dec 06 09:34:00 crc kubenswrapper[4672]: I1206 09:34:00.569557 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9bafbf-4733-4178-8012-3e94d02aa9cb" path="/var/lib/kubelet/pods/6a9bafbf-4733-4178-8012-3e94d02aa9cb/volumes"
Dec 06 09:34:00 crc kubenswrapper[4672]: I1206 09:34:00.570394 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" path="/var/lib/kubelet/pods/d89d3fb0-8fa5-46b0-a1e3-353459d7cff7/volumes"
Dec 06 09:34:01 crc kubenswrapper[4672]: I1206 09:34:01.557828 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:34:01 crc kubenswrapper[4672]: E1206 09:34:01.558366 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:34:02 crc kubenswrapper[4672]: I1206 09:34:02.039130 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hrzmd"]
Dec 06 09:34:02 crc kubenswrapper[4672]: I1206 09:34:02.046383 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hrzmd"]
Dec 06 09:34:02 crc kubenswrapper[4672]: I1206 09:34:02.566817 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e11e53b-de6c-4e98-8a9c-7fbdac1a1401" path="/var/lib/kubelet/pods/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401/volumes"
path="/var/lib/kubelet/pods/6e11e53b-de6c-4e98-8a9c-7fbdac1a1401/volumes" Dec 06 09:34:13 crc kubenswrapper[4672]: I1206 09:34:13.556803 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:34:13 crc kubenswrapper[4672]: E1206 09:34:13.557713 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:34:21 crc kubenswrapper[4672]: I1206 09:34:21.048531 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jnkpx"] Dec 06 09:34:21 crc kubenswrapper[4672]: I1206 09:34:21.056003 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jnkpx"] Dec 06 09:34:22 crc kubenswrapper[4672]: I1206 09:34:22.592204 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34617816-681d-44a7-b88d-73983735dd75" path="/var/lib/kubelet/pods/34617816-681d-44a7-b88d-73983735dd75/volumes" Dec 06 09:34:25 crc kubenswrapper[4672]: I1206 09:34:25.794209 4672 scope.go:117] "RemoveContainer" containerID="3ce98b553cc83461588d7a633c3558405e42b7ae843dd18b41c50eb9a8cce884" Dec 06 09:34:25 crc kubenswrapper[4672]: I1206 09:34:25.826735 4672 scope.go:117] "RemoveContainer" containerID="7ea2d51b6c8a3a3d7dd06aa4aa9f77784520acda5263e9107f77febfec11562e" Dec 06 09:34:25 crc kubenswrapper[4672]: I1206 09:34:25.876803 4672 scope.go:117] "RemoveContainer" containerID="1499aa2a253aa13e19e1279daa60b9266aaa4840c9640a1dd613305753d51684" Dec 06 09:34:25 crc kubenswrapper[4672]: I1206 09:34:25.915147 4672 scope.go:117] "RemoveContainer" containerID="8a5bc2686ab392e7026d7311a2b21f7285dff688effd999c36f0d14f7b03646b" Dec 06 09:34:25 crc kubenswrapper[4672]: I1206 09:34:25.948687 4672 scope.go:117] "RemoveContainer" containerID="7f4f5e161f8febd2d5ed444cf217a57031e468ee35323ee9e8ec347625d57236" Dec 06 09:34:26 crc kubenswrapper[4672]: I1206 09:34:26.016051 4672 scope.go:117] "RemoveContainer" containerID="e194302916df6a7135e83785075becbc55e764210162c9c32aaa1fa745ca1f2e" Dec 06 09:34:28 crc kubenswrapper[4672]: I1206 09:34:28.557057 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:34:28 crc kubenswrapper[4672]: E1206 09:34:28.558022 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:34:31 crc kubenswrapper[4672]: I1206 09:34:31.176545 4672 generic.go:334] "Generic (PLEG): container finished" podID="202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" containerID="2ee253348ca477296a8c93bd7223507e4bf865837529803da55c069a07c171f6" exitCode=0 Dec 06 09:34:31 crc kubenswrapper[4672]: I1206 09:34:31.176937 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" 
event={"ID":"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e","Type":"ContainerDied","Data":"2ee253348ca477296a8c93bd7223507e4bf865837529803da55c069a07c171f6"} Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.637766 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.761148 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-ssh-key\") pod \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.761215 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-inventory\") pod \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.761307 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhfvr\" (UniqueName: \"kubernetes.io/projected/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-kube-api-access-rhfvr\") pod \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\" (UID: \"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e\") " Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.768996 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-kube-api-access-rhfvr" (OuterVolumeSpecName: "kube-api-access-rhfvr") pod "202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" (UID: "202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e"). InnerVolumeSpecName "kube-api-access-rhfvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.787454 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" (UID: "202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.788180 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-inventory" (OuterVolumeSpecName: "inventory") pod "202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" (UID: "202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.863121 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.863153 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhfvr\" (UniqueName: \"kubernetes.io/projected/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-kube-api-access-rhfvr\") on node \"crc\" DevicePath \"\"" Dec 06 09:34:32 crc kubenswrapper[4672]: I1206 09:34:32.863163 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.196636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" event={"ID":"202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e","Type":"ContainerDied","Data":"437b05133aae101e75eebb3c2b0e6d10c1342c8ba42f07cc7759bdb7aae7554e"} Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.196980 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437b05133aae101e75eebb3c2b0e6d10c1342c8ba42f07cc7759bdb7aae7554e" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.196697 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.293864 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"] Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294252 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="extract-utilities" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294270 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="extract-utilities" Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294299 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="registry-server" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294325 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="registry-server" Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294351 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="extract-utilities" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294360 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="extract-utilities" Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294369 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="extract-content" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294377 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="extract-content" Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294408 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="extract-content" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294417 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="extract-content" Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294434 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="registry-server" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294442 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="registry-server" Dec 06 09:34:33 crc kubenswrapper[4672]: E1206 09:34:33.294453 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294462 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294689 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294726 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89d3fb0-8fa5-46b0-a1e3-353459d7cff7" containerName="registry-server" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.294737 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="60193924-70a5-4baa-9d3f-1f39f3a3c0dc" containerName="registry-server" Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.295393 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.297900 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.297917 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.298062 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.298763 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.312805 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"]
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.371854 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.371923 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.371971 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmg7w\" (UniqueName: \"kubernetes.io/projected/adc9f5aa-a718-43cf-8889-1e71aaa151c4-kube-api-access-tmg7w\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.473083 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.475033 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmg7w\" (UniqueName: \"kubernetes.io/projected/adc9f5aa-a718-43cf-8889-1e71aaa151c4-kube-api-access-tmg7w\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.475179 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.481227 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.483513 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.493847 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmg7w\" (UniqueName: \"kubernetes.io/projected/adc9f5aa-a718-43cf-8889-1e71aaa151c4-kube-api-access-tmg7w\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:33 crc kubenswrapper[4672]: I1206 09:34:33.612757 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:34 crc kubenswrapper[4672]: I1206 09:34:34.159902 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"]
Dec 06 09:34:34 crc kubenswrapper[4672]: I1206 09:34:34.208312 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn" event={"ID":"adc9f5aa-a718-43cf-8889-1e71aaa151c4","Type":"ContainerStarted","Data":"839168eada371fe749cea83622995f5b93a5ab4ed40fe4edac9bae28ea6b0793"}
Dec 06 09:34:35 crc kubenswrapper[4672]: I1206 09:34:35.216172 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn" event={"ID":"adc9f5aa-a718-43cf-8889-1e71aaa151c4","Type":"ContainerStarted","Data":"edb58ebf470502fa3a0a3ed0904044ded966a8d1ec88988985df9f3aeb872457"}
Dec 06 09:34:35 crc kubenswrapper[4672]: I1206 09:34:35.231894 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn" podStartSLOduration=1.818800609 podStartE2EDuration="2.231876578s" podCreationTimestamp="2025-12-06 09:34:33 +0000 UTC" firstStartedPulling="2025-12-06 09:34:34.17124052 +0000 UTC m=+1691.915500807" lastFinishedPulling="2025-12-06 09:34:34.584316489 +0000 UTC m=+1692.328576776" observedRunningTime="2025-12-06 09:34:35.229658577 +0000 UTC m=+1692.973918864" watchObservedRunningTime="2025-12-06 09:34:35.231876578 +0000 UTC m=+1692.976136865"
Dec 06 09:34:40 crc kubenswrapper[4672]: I1206 09:34:40.266764 4672 generic.go:334] "Generic (PLEG): container finished" podID="adc9f5aa-a718-43cf-8889-1e71aaa151c4" containerID="edb58ebf470502fa3a0a3ed0904044ded966a8d1ec88988985df9f3aeb872457" exitCode=0
Dec 06 09:34:40 crc kubenswrapper[4672]: I1206 09:34:40.266859 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn" event={"ID":"adc9f5aa-a718-43cf-8889-1e71aaa151c4","Type":"ContainerDied","Data":"edb58ebf470502fa3a0a3ed0904044ded966a8d1ec88988985df9f3aeb872457"}
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.817733 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.941830 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-ssh-key\") pod \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") "
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.941979 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-inventory\") pod \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") "
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.942028 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmg7w\" (UniqueName: \"kubernetes.io/projected/adc9f5aa-a718-43cf-8889-1e71aaa151c4-kube-api-access-tmg7w\") pod \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\" (UID: \"adc9f5aa-a718-43cf-8889-1e71aaa151c4\") "
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.948900 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adc9f5aa-a718-43cf-8889-1e71aaa151c4-kube-api-access-tmg7w" (OuterVolumeSpecName: "kube-api-access-tmg7w") pod "adc9f5aa-a718-43cf-8889-1e71aaa151c4" (UID: "adc9f5aa-a718-43cf-8889-1e71aaa151c4"). InnerVolumeSpecName "kube-api-access-tmg7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.991998 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "adc9f5aa-a718-43cf-8889-1e71aaa151c4" (UID: "adc9f5aa-a718-43cf-8889-1e71aaa151c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:34:41 crc kubenswrapper[4672]: I1206 09:34:41.994385 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-inventory" (OuterVolumeSpecName: "inventory") pod "adc9f5aa-a718-43cf-8889-1e71aaa151c4" (UID: "adc9f5aa-a718-43cf-8889-1e71aaa151c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.044404 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.044561 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adc9f5aa-a718-43cf-8889-1e71aaa151c4-inventory\") on node \"crc\" DevicePath \"\""
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.044682 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmg7w\" (UniqueName: \"kubernetes.io/projected/adc9f5aa-a718-43cf-8889-1e71aaa151c4-kube-api-access-tmg7w\") on node \"crc\" DevicePath \"\""
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.289922 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn" event={"ID":"adc9f5aa-a718-43cf-8889-1e71aaa151c4","Type":"ContainerDied","Data":"839168eada371fe749cea83622995f5b93a5ab4ed40fe4edac9bae28ea6b0793"}
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.289959 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839168eada371fe749cea83622995f5b93a5ab4ed40fe4edac9bae28ea6b0793"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.290398 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.389873 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"]
Dec 06 09:34:42 crc kubenswrapper[4672]: E1206 09:34:42.390196 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc9f5aa-a718-43cf-8889-1e71aaa151c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.390212 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc9f5aa-a718-43cf-8889-1e71aaa151c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.390380 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="adc9f5aa-a718-43cf-8889-1e71aaa151c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.392215 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.394460 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.394735 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.395476 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.395813 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.411885 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"]
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.452050 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.452147 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.452224 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltr54\" (UniqueName: \"kubernetes.io/projected/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-kube-api-access-ltr54\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.553382 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.553470 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.553526 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltr54\" (UniqueName: \"kubernetes.io/projected/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-kube-api-access-ltr54\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.558178 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.561195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.581406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltr54\" (UniqueName: \"kubernetes.io/projected/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-kube-api-access-ltr54\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2n65f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:42 crc kubenswrapper[4672]: I1206 09:34:42.713005 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"
Dec 06 09:34:43 crc kubenswrapper[4672]: I1206 09:34:43.384259 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"]
Dec 06 09:34:43 crc kubenswrapper[4672]: W1206 09:34:43.396678 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6cb6b82_34a9_4543_88ac_89d2a1a52a0f.slice/crio-fe7a061a9a57f69b51df5811fc7f19532e1b1e5a2a3f958205b66ac1fa541279 WatchSource:0}: Error finding container fe7a061a9a57f69b51df5811fc7f19532e1b1e5a2a3f958205b66ac1fa541279: Status 404 returned error can't find the container with id fe7a061a9a57f69b51df5811fc7f19532e1b1e5a2a3f958205b66ac1fa541279
Dec 06 09:34:43 crc kubenswrapper[4672]: I1206 09:34:43.557211 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:34:43 crc kubenswrapper[4672]: E1206 09:34:43.557802 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:34:44 crc kubenswrapper[4672]: I1206 09:34:44.312508 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" event={"ID":"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f","Type":"ContainerStarted","Data":"ea5b27f0547da00d36e63961b537fcf4e7fecbc88c14a3b5185fdaf4d13a12ac"}
Dec 06 09:34:44 crc kubenswrapper[4672]: I1206 09:34:44.312989 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" event={"ID":"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f","Type":"ContainerStarted","Data":"fe7a061a9a57f69b51df5811fc7f19532e1b1e5a2a3f958205b66ac1fa541279"}
Dec 06 09:34:53 crc kubenswrapper[4672]: I1206 09:34:53.051834 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" podStartSLOduration=10.653249802 podStartE2EDuration="11.051818619s" podCreationTimestamp="2025-12-06 09:34:42 +0000 UTC" firstStartedPulling="2025-12-06 09:34:43.400243682 +0000 UTC m=+1701.144503969" lastFinishedPulling="2025-12-06 09:34:43.798812499 +0000 UTC m=+1701.543072786" observedRunningTime="2025-12-06 09:34:44.338045374 +0000 UTC m=+1702.082305721" watchObservedRunningTime="2025-12-06 09:34:53.051818619 +0000 UTC m=+1710.796078906"
Dec 06 09:34:53 crc kubenswrapper[4672]: I1206 09:34:53.057146 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hwq9m"]
Dec 06 09:34:53 crc kubenswrapper[4672]: I1206 09:34:53.063350 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hwq9m"]
Dec 06 09:34:54 crc kubenswrapper[4672]: I1206 09:34:54.037817 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7xtlx"]
Dec 06 09:34:54 crc kubenswrapper[4672]: I1206 09:34:54.046509 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7xtlx"]
Dec 06 09:34:54 crc kubenswrapper[4672]: I1206 09:34:54.571247 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35287f3b-4228-4e03-9ee9-c837a57009d5" path="/var/lib/kubelet/pods/35287f3b-4228-4e03-9ee9-c837a57009d5/volumes"
Dec 06 09:34:54 crc kubenswrapper[4672]: I1206 09:34:54.574310 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8acaf68e-bce8-458b-bdb7-054e1ea6269a" path="/var/lib/kubelet/pods/8acaf68e-bce8-458b-bdb7-054e1ea6269a/volumes"
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.040857 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6b72-account-create-update-cgrb2"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.049538 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c7f7-account-create-update-2v6qt"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.057984 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tgdtg"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.066171 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tgdtg"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.073142 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-01e6-account-create-update-t97qt"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.079653 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c7f7-account-create-update-2v6qt"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.085706 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6b72-account-create-update-cgrb2"]
Dec 06 09:34:55 crc kubenswrapper[4672]: I1206 09:34:55.091417 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-01e6-account-create-update-t97qt"]
Dec 06 09:34:56 crc kubenswrapper[4672]: I1206 09:34:56.571343 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa4486a-c2a9-4144-88f0-68ceb2b8ce76" path="/var/lib/kubelet/pods/0aa4486a-c2a9-4144-88f0-68ceb2b8ce76/volumes"
Dec 06 09:34:56 crc kubenswrapper[4672]: I1206 09:34:56.572654 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc460c7-9f51-4f71-bc98-d4b588694439" path="/var/lib/kubelet/pods/3bc460c7-9f51-4f71-bc98-d4b588694439/volumes"
Dec 06 09:34:56 crc kubenswrapper[4672]: I1206 09:34:56.573779 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dfad81d-78bd-4063-87f6-a26a8d24205f" path="/var/lib/kubelet/pods/5dfad81d-78bd-4063-87f6-a26a8d24205f/volumes"
Dec 06 09:34:56 crc kubenswrapper[4672]: I1206 09:34:56.575209 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09b8611-6210-4e51-bf26-cfbcd8732572" path="/var/lib/kubelet/pods/e09b8611-6210-4e51-bf26-cfbcd8732572/volumes"
Dec 06 09:34:58 crc kubenswrapper[4672]: I1206 09:34:58.557231 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:34:58 crc kubenswrapper[4672]: E1206 09:34:58.557833 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:35:12 crc kubenswrapper[4672]: I1206 09:35:12.566927 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:35:12 crc kubenswrapper[4672]: E1206 09:35:12.569104 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:35:23 crc kubenswrapper[4672]: I1206 09:35:23.047752 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd7p6"]
Dec 06 09:35:23 crc kubenswrapper[4672]: I1206 09:35:23.057356 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd7p6"]
Dec 06 09:35:23 crc kubenswrapper[4672]: I1206 09:35:23.556896 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:35:23 crc kubenswrapper[4672]: E1206 09:35:23.557208 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:35:24 crc kubenswrapper[4672]: I1206 09:35:24.584924 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afdbd55-d7bc-4744-878a-389d84f66824" path="/var/lib/kubelet/pods/2afdbd55-d7bc-4744-878a-389d84f66824/volumes"
Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.194234 4672 scope.go:117] "RemoveContainer" containerID="72094cf055bd3e19b3c7495992f99628ed015cfb7785ef10e2c771bb078e2a54"
containerID="72094cf055bd3e19b3c7495992f99628ed015cfb7785ef10e2c771bb078e2a54" Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.227485 4672 scope.go:117] "RemoveContainer" containerID="28b71027accb54967268c1a6d7d788e99b44d6a325bae206815caa9315911420" Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.283405 4672 scope.go:117] "RemoveContainer" containerID="5d88cb3ecea1b968344bbad4d530e5577cca48c6a8ff7b18cb6dde570d86a4ad" Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.323016 4672 scope.go:117] "RemoveContainer" containerID="50a8592699b7ae495abb8cd99ba5e3f70412e0bcffd0a769435c6ab04dcdfda1" Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.399611 4672 scope.go:117] "RemoveContainer" containerID="fde2a159531758f663d3b1495a399a2e254e9b46e28bb6e7a904cac7ea8b5d59" Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.418467 4672 scope.go:117] "RemoveContainer" containerID="96e31fe026ffc3c7cfa3864a6332b7a364aeb3f75dacce66d8f2c0d2c3a709a6" Dec 06 09:35:26 crc kubenswrapper[4672]: I1206 09:35:26.460317 4672 scope.go:117] "RemoveContainer" containerID="1a3a91f37ad129034295446401788ca2ef731e6bf91f7c6dc9c5db44128b2214" Dec 06 09:35:37 crc kubenswrapper[4672]: I1206 09:35:37.557949 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:35:37 crc kubenswrapper[4672]: E1206 09:35:37.559160 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:35:43 crc kubenswrapper[4672]: I1206 09:35:43.058730 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mc84g"] Dec 06 09:35:43 crc kubenswrapper[4672]: I1206 09:35:43.067789 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mc84g"] Dec 06 09:35:44 crc kubenswrapper[4672]: I1206 09:35:44.570708 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b02ee21-c208-483e-b2e9-8830c54605d7" path="/var/lib/kubelet/pods/8b02ee21-c208-483e-b2e9-8830c54605d7/volumes" Dec 06 09:35:45 crc kubenswrapper[4672]: I1206 09:35:45.038412 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zlxms"] Dec 06 09:35:45 crc kubenswrapper[4672]: I1206 09:35:45.050090 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zlxms"] Dec 06 09:35:46 crc kubenswrapper[4672]: I1206 09:35:46.570713 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ef5f2b-0e30-4ee4-b6b3-957ee08ad335" path="/var/lib/kubelet/pods/71ef5f2b-0e30-4ee4-b6b3-957ee08ad335/volumes" Dec 06 09:35:48 crc kubenswrapper[4672]: I1206 09:35:48.557728 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:35:48 crc kubenswrapper[4672]: E1206 09:35:48.558464 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:35:51 crc kubenswrapper[4672]: I1206 09:35:51.489126 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" containerID="ea5b27f0547da00d36e63961b537fcf4e7fecbc88c14a3b5185fdaf4d13a12ac" exitCode=0 Dec 06 09:35:51 crc kubenswrapper[4672]: I1206 09:35:51.489184 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" event={"ID":"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f","Type":"ContainerDied","Data":"ea5b27f0547da00d36e63961b537fcf4e7fecbc88c14a3b5185fdaf4d13a12ac"} Dec 06 09:35:52 crc kubenswrapper[4672]: I1206 09:35:52.994644 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.118716 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-inventory\") pod \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.118760 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-ssh-key\") pod \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.118833 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltr54\" (UniqueName: \"kubernetes.io/projected/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-kube-api-access-ltr54\") pod \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\" (UID: \"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f\") " Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.125013 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-kube-api-access-ltr54" (OuterVolumeSpecName: "kube-api-access-ltr54") pod "d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" (UID: "d6cb6b82-34a9-4543-88ac-89d2a1a52a0f"). InnerVolumeSpecName "kube-api-access-ltr54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.152631 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" (UID: "d6cb6b82-34a9-4543-88ac-89d2a1a52a0f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.176265 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-inventory" (OuterVolumeSpecName: "inventory") pod "d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" (UID: "d6cb6b82-34a9-4543-88ac-89d2a1a52a0f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.220661 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltr54\" (UniqueName: \"kubernetes.io/projected/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-kube-api-access-ltr54\") on node \"crc\" DevicePath \"\"" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.220720 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.220734 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.510978 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" event={"ID":"d6cb6b82-34a9-4543-88ac-89d2a1a52a0f","Type":"ContainerDied","Data":"fe7a061a9a57f69b51df5811fc7f19532e1b1e5a2a3f958205b66ac1fa541279"} Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.511341 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe7a061a9a57f69b51df5811fc7f19532e1b1e5a2a3f958205b66ac1fa541279" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.511087 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.594393 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5rtn"] Dec 06 09:35:53 crc kubenswrapper[4672]: E1206 09:35:53.594747 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.594759 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.594934 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.595500 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.597981 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.598467 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.600326 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.608511 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.619046 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5rtn"]
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.731187 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2k2x\" (UniqueName: \"kubernetes.io/projected/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-kube-api-access-m2k2x\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.731763 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.731923 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.834136 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.834245 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.834330 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k2x\" (UniqueName: \"kubernetes.io/projected/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-kube-api-access-m2k2x\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.838079 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.838239 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.850142 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2k2x\" (UniqueName: \"kubernetes.io/projected/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-kube-api-access-m2k2x\") pod \"ssh-known-hosts-edpm-deployment-q5rtn\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:53 crc kubenswrapper[4672]: I1206 09:35:53.912526 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:35:54 crc kubenswrapper[4672]: I1206 09:35:54.479838 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5rtn"]
Dec 06 09:35:54 crc kubenswrapper[4672]: I1206 09:35:54.531335 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn" event={"ID":"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a","Type":"ContainerStarted","Data":"04daf0516e7dc0ef06d49084968353fda7971e334185570d89a2b42ecd9944e6"}
Dec 06 09:35:55 crc kubenswrapper[4672]: I1206 09:35:55.560854 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn" event={"ID":"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a","Type":"ContainerStarted","Data":"68148a62182292f0e0af5b426c0a90a6c2c4bf2f5d666e80043910147604e434"}
Dec 06 09:35:55 crc kubenswrapper[4672]: I1206 09:35:55.586171 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn" podStartSLOduration=2.018016782 podStartE2EDuration="2.586103536s" podCreationTimestamp="2025-12-06 09:35:53 +0000 UTC" firstStartedPulling="2025-12-06 09:35:54.497240446 +0000 UTC m=+1772.241500773" lastFinishedPulling="2025-12-06 09:35:55.06532725 +0000 UTC m=+1772.809587527" observedRunningTime="2025-12-06 09:35:55.575269424 +0000 UTC m=+1773.319529751" watchObservedRunningTime="2025-12-06 09:35:55.586103536 +0000 UTC m=+1773.330363833"
Dec 06 09:36:03 crc kubenswrapper[4672]: I1206 09:36:03.557720 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999"
Dec 06 09:36:03 crc kubenswrapper[4672]: E1206 09:36:03.558741 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:36:03 crc kubenswrapper[4672]: I1206 09:36:03.635807 4672 generic.go:334] "Generic (PLEG): container finished" podID="5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" containerID="68148a62182292f0e0af5b426c0a90a6c2c4bf2f5d666e80043910147604e434" exitCode=0
Dec 06 09:36:03 crc kubenswrapper[4672]: I1206 09:36:03.635937 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn" event={"ID":"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a","Type":"ContainerDied","Data":"68148a62182292f0e0af5b426c0a90a6c2c4bf2f5d666e80043910147604e434"}
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.062138 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn"
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.190458 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-ssh-key-openstack-edpm-ipam\") pod \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") "
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.190999 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-inventory-0\") pod \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") "
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.191173 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2k2x\" (UniqueName: \"kubernetes.io/projected/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-kube-api-access-m2k2x\") pod \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\" (UID: \"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a\") "
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.200002 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-kube-api-access-m2k2x" (OuterVolumeSpecName: "kube-api-access-m2k2x") pod "5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" (UID: "5a25749d-e4fb-4a67-9a0d-d95b4aa8609a"). InnerVolumeSpecName "kube-api-access-m2k2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.219641 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" (UID: "5a25749d-e4fb-4a67-9a0d-d95b4aa8609a"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.229009 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" (UID: "5a25749d-e4fb-4a67-9a0d-d95b4aa8609a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.293262 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2k2x\" (UniqueName: \"kubernetes.io/projected/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-kube-api-access-m2k2x\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.293301 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.293318 4672 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.660577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn" event={"ID":"5a25749d-e4fb-4a67-9a0d-d95b4aa8609a","Type":"ContainerDied","Data":"04daf0516e7dc0ef06d49084968353fda7971e334185570d89a2b42ecd9944e6"} Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.660644 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04daf0516e7dc0ef06d49084968353fda7971e334185570d89a2b42ecd9944e6" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.660651 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5rtn" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.815880 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx"] Dec 06 09:36:05 crc kubenswrapper[4672]: E1206 09:36:05.816311 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" containerName="ssh-known-hosts-edpm-deployment" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.816329 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" containerName="ssh-known-hosts-edpm-deployment" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.816544 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" containerName="ssh-known-hosts-edpm-deployment" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.817234 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.820158 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.820664 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.822440 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.825169 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx"] Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.826496 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.905876 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.905956 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tb9\" (UniqueName: \"kubernetes.io/projected/c09fbca5-c748-48fc-8008-d2b3d644df4a-kube-api-access-72tb9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:05 crc kubenswrapper[4672]: I1206 09:36:05.905990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.007639 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.007923 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72tb9\" (UniqueName: \"kubernetes.io/projected/c09fbca5-c748-48fc-8008-d2b3d644df4a-kube-api-access-72tb9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.008039 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.013900 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.014387 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.030492 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tb9\" (UniqueName: \"kubernetes.io/projected/c09fbca5-c748-48fc-8008-d2b3d644df4a-kube-api-access-72tb9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9gcwx\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.132667 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.570838 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx"] Dec 06 09:36:06 crc kubenswrapper[4672]: I1206 09:36:06.671814 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" event={"ID":"c09fbca5-c748-48fc-8008-d2b3d644df4a","Type":"ContainerStarted","Data":"8bf54c2e3ec8ace1f1e1469797babcb04b599a05773d2c01f05bc522219b2144"} Dec 06 09:36:07 crc kubenswrapper[4672]: I1206 09:36:07.685370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" event={"ID":"c09fbca5-c748-48fc-8008-d2b3d644df4a","Type":"ContainerStarted","Data":"a314e652a2d077d56ab8a91a1cb6e22b64bda121bde14cb706a48eb6a7e88fe8"} Dec 06 09:36:07 crc kubenswrapper[4672]: I1206 09:36:07.714741 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" podStartSLOduration=2.324978673 podStartE2EDuration="2.714723703s" podCreationTimestamp="2025-12-06 09:36:05 +0000 UTC" firstStartedPulling="2025-12-06 09:36:06.555988558 +0000 UTC m=+1784.300248845" lastFinishedPulling="2025-12-06 09:36:06.945733558 +0000 UTC m=+1784.689993875" observedRunningTime="2025-12-06 09:36:07.708394503 +0000 UTC m=+1785.452654790" watchObservedRunningTime="2025-12-06 09:36:07.714723703 +0000 UTC m=+1785.458983990" Dec 06 09:36:16 crc kubenswrapper[4672]: I1206 09:36:16.788276 4672 generic.go:334] "Generic (PLEG): container finished" podID="c09fbca5-c748-48fc-8008-d2b3d644df4a" containerID="a314e652a2d077d56ab8a91a1cb6e22b64bda121bde14cb706a48eb6a7e88fe8" exitCode=0 Dec 06 09:36:16 crc kubenswrapper[4672]: I1206 09:36:16.788367 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" 
event={"ID":"c09fbca5-c748-48fc-8008-d2b3d644df4a","Type":"ContainerDied","Data":"a314e652a2d077d56ab8a91a1cb6e22b64bda121bde14cb706a48eb6a7e88fe8"} Dec 06 09:36:17 crc kubenswrapper[4672]: I1206 09:36:17.556891 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:36:17 crc kubenswrapper[4672]: E1206 09:36:17.557285 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.225081 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.386511 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72tb9\" (UniqueName: \"kubernetes.io/projected/c09fbca5-c748-48fc-8008-d2b3d644df4a-kube-api-access-72tb9\") pod \"c09fbca5-c748-48fc-8008-d2b3d644df4a\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.386616 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-ssh-key\") pod \"c09fbca5-c748-48fc-8008-d2b3d644df4a\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.386703 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-inventory\") pod \"c09fbca5-c748-48fc-8008-d2b3d644df4a\" (UID: \"c09fbca5-c748-48fc-8008-d2b3d644df4a\") " Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.392549 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09fbca5-c748-48fc-8008-d2b3d644df4a-kube-api-access-72tb9" (OuterVolumeSpecName: "kube-api-access-72tb9") pod "c09fbca5-c748-48fc-8008-d2b3d644df4a" (UID: "c09fbca5-c748-48fc-8008-d2b3d644df4a"). InnerVolumeSpecName "kube-api-access-72tb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.419498 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-inventory" (OuterVolumeSpecName: "inventory") pod "c09fbca5-c748-48fc-8008-d2b3d644df4a" (UID: "c09fbca5-c748-48fc-8008-d2b3d644df4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.427522 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c09fbca5-c748-48fc-8008-d2b3d644df4a" (UID: "c09fbca5-c748-48fc-8008-d2b3d644df4a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.488782 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72tb9\" (UniqueName: \"kubernetes.io/projected/c09fbca5-c748-48fc-8008-d2b3d644df4a-kube-api-access-72tb9\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.488827 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.488839 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09fbca5-c748-48fc-8008-d2b3d644df4a-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.810752 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" event={"ID":"c09fbca5-c748-48fc-8008-d2b3d644df4a","Type":"ContainerDied","Data":"8bf54c2e3ec8ace1f1e1469797babcb04b599a05773d2c01f05bc522219b2144"} Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.810816 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf54c2e3ec8ace1f1e1469797babcb04b599a05773d2c01f05bc522219b2144" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.810840 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.919949 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6"] Dec 06 09:36:18 crc kubenswrapper[4672]: E1206 09:36:18.920410 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09fbca5-c748-48fc-8008-d2b3d644df4a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.920427 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09fbca5-c748-48fc-8008-d2b3d644df4a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.920664 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09fbca5-c748-48fc-8008-d2b3d644df4a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.921349 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.923963 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.924101 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.924300 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.924355 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:36:18 crc kubenswrapper[4672]: I1206 09:36:18.935018 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6"] Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.101900 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.102243 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.102772 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sblp4\" (UniqueName: \"kubernetes.io/projected/067d00eb-9c67-44d0-a734-70c09ab491a2-kube-api-access-sblp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.203540 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.203636 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sblp4\" (UniqueName: \"kubernetes.io/projected/067d00eb-9c67-44d0-a734-70c09ab491a2-kube-api-access-sblp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.203691 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: 
\"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.208432 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.209701 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.232371 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sblp4\" (UniqueName: \"kubernetes.io/projected/067d00eb-9c67-44d0-a734-70c09ab491a2-kube-api-access-sblp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.245123 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.784176 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6"] Dec 06 09:36:19 crc kubenswrapper[4672]: I1206 09:36:19.823251 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" event={"ID":"067d00eb-9c67-44d0-a734-70c09ab491a2","Type":"ContainerStarted","Data":"a65d9ba467f540548e288b8bc67f032f16cc3712c963dc5ee1ab3afd8f39dbf5"} Dec 06 09:36:20 crc kubenswrapper[4672]: I1206 09:36:20.835386 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" event={"ID":"067d00eb-9c67-44d0-a734-70c09ab491a2","Type":"ContainerStarted","Data":"82923c4e0c03dad55123bf8cb1a799d163c4ca6f1a1acb88f3023abad9d3c8dc"} Dec 06 09:36:20 crc kubenswrapper[4672]: I1206 09:36:20.853379 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" podStartSLOduration=2.450369095 podStartE2EDuration="2.853352532s" podCreationTimestamp="2025-12-06 09:36:18 +0000 UTC" firstStartedPulling="2025-12-06 09:36:19.792075727 +0000 UTC m=+1797.536336024" lastFinishedPulling="2025-12-06 09:36:20.195059134 +0000 UTC m=+1797.939319461" observedRunningTime="2025-12-06 09:36:20.851162282 +0000 UTC m=+1798.595422569" watchObservedRunningTime="2025-12-06 09:36:20.853352532 +0000 UTC m=+1798.597612849" Dec 06 09:36:26 crc kubenswrapper[4672]: I1206 09:36:26.651789 4672 scope.go:117] "RemoveContainer" containerID="a2a360d83bce94c2323b21912f26a423771f5c2386dffa6443b23b111c1fc5f9" Dec 06 09:36:26 crc kubenswrapper[4672]: I1206 09:36:26.704311 4672 scope.go:117] "RemoveContainer" containerID="02d219a88ba9a4eced955f38312f6ea28188845dee97713baf52fb4683cc0a89" Dec 06 09:36:27 crc kubenswrapper[4672]: I1206 09:36:27.051979 4672 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mtfvz"] Dec 06 09:36:27 crc kubenswrapper[4672]: I1206 09:36:27.065754 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mtfvz"] Dec 06 09:36:28 crc kubenswrapper[4672]: I1206 09:36:28.572386 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9531e27a-bb7e-4700-9bd9-5008c3c7b12f" path="/var/lib/kubelet/pods/9531e27a-bb7e-4700-9bd9-5008c3c7b12f/volumes" Dec 06 09:36:30 crc kubenswrapper[4672]: I1206 09:36:30.557008 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:36:30 crc kubenswrapper[4672]: E1206 09:36:30.558179 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:36:30 crc kubenswrapper[4672]: I1206 09:36:30.943082 4672 generic.go:334] "Generic (PLEG): container finished" podID="067d00eb-9c67-44d0-a734-70c09ab491a2" containerID="82923c4e0c03dad55123bf8cb1a799d163c4ca6f1a1acb88f3023abad9d3c8dc" exitCode=0 Dec 06 09:36:30 crc kubenswrapper[4672]: I1206 09:36:30.943269 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" event={"ID":"067d00eb-9c67-44d0-a734-70c09ab491a2","Type":"ContainerDied","Data":"82923c4e0c03dad55123bf8cb1a799d163c4ca6f1a1acb88f3023abad9d3c8dc"} Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.416293 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.583490 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-ssh-key\") pod \"067d00eb-9c67-44d0-a734-70c09ab491a2\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.584832 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-inventory\") pod \"067d00eb-9c67-44d0-a734-70c09ab491a2\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.584874 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sblp4\" (UniqueName: \"kubernetes.io/projected/067d00eb-9c67-44d0-a734-70c09ab491a2-kube-api-access-sblp4\") pod \"067d00eb-9c67-44d0-a734-70c09ab491a2\" (UID: \"067d00eb-9c67-44d0-a734-70c09ab491a2\") " Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.593701 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067d00eb-9c67-44d0-a734-70c09ab491a2-kube-api-access-sblp4" (OuterVolumeSpecName: "kube-api-access-sblp4") pod "067d00eb-9c67-44d0-a734-70c09ab491a2" (UID: "067d00eb-9c67-44d0-a734-70c09ab491a2"). InnerVolumeSpecName "kube-api-access-sblp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.611327 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "067d00eb-9c67-44d0-a734-70c09ab491a2" (UID: "067d00eb-9c67-44d0-a734-70c09ab491a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.628562 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-inventory" (OuterVolumeSpecName: "inventory") pod "067d00eb-9c67-44d0-a734-70c09ab491a2" (UID: "067d00eb-9c67-44d0-a734-70c09ab491a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.686899 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.687137 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/067d00eb-9c67-44d0-a734-70c09ab491a2-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.687489 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sblp4\" (UniqueName: \"kubernetes.io/projected/067d00eb-9c67-44d0-a734-70c09ab491a2-kube-api-access-sblp4\") on node \"crc\" DevicePath \"\"" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.971247 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" event={"ID":"067d00eb-9c67-44d0-a734-70c09ab491a2","Type":"ContainerDied","Data":"a65d9ba467f540548e288b8bc67f032f16cc3712c963dc5ee1ab3afd8f39dbf5"} Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.971325 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a65d9ba467f540548e288b8bc67f032f16cc3712c963dc5ee1ab3afd8f39dbf5" Dec 06 09:36:32 crc kubenswrapper[4672]: I1206 09:36:32.971282 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6" Dec 06 09:36:43 crc kubenswrapper[4672]: I1206 09:36:43.556555 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:36:43 crc kubenswrapper[4672]: E1206 09:36:43.557439 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:36:55 crc kubenswrapper[4672]: I1206 09:36:55.557592 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:36:55 crc kubenswrapper[4672]: E1206 09:36:55.558858 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:37:06 crc kubenswrapper[4672]: I1206 09:37:06.557307 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:37:06 crc kubenswrapper[4672]: E1206 09:37:06.558412 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:37:18 crc kubenswrapper[4672]: I1206 09:37:18.557685 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:37:18 crc kubenswrapper[4672]: E1206 09:37:18.558471 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:37:26 crc kubenswrapper[4672]: I1206 09:37:26.830016 4672 scope.go:117] "RemoveContainer" containerID="767ad21198037849acf6a95a7a19a08a7e36946276a177e4b827a921edb12e65" Dec 06 09:37:31 crc kubenswrapper[4672]: I1206 09:37:31.558279 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:37:31 crc kubenswrapper[4672]: E1206 09:37:31.559765 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:37:42 crc kubenswrapper[4672]: I1206 09:37:42.563221 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:37:43 crc kubenswrapper[4672]: I1206 09:37:43.730703 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"71a9b1f852f62d2842628c107ad52332f067c0c03486844efb4bf89c1bd15b0e"} Dec 06 09:39:42 crc kubenswrapper[4672]: I1206 09:39:42.319867 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:39:42 crc kubenswrapper[4672]: I1206 09:39:42.320463 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:40:12 crc kubenswrapper[4672]: I1206 09:40:12.320311 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:40:12 crc kubenswrapper[4672]: I1206 09:40:12.322018 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:40:42 crc kubenswrapper[4672]: I1206 09:40:42.319692 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:40:42 crc kubenswrapper[4672]: I1206 09:40:42.320139 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:40:42 crc kubenswrapper[4672]: I1206 09:40:42.320188 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:40:42 crc kubenswrapper[4672]: I1206 09:40:42.320932 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71a9b1f852f62d2842628c107ad52332f067c0c03486844efb4bf89c1bd15b0e"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:40:42 crc kubenswrapper[4672]: 
I1206 09:40:42.320984 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://71a9b1f852f62d2842628c107ad52332f067c0c03486844efb4bf89c1bd15b0e" gracePeriod=600 Dec 06 09:40:43 crc kubenswrapper[4672]: I1206 09:40:43.846915 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="71a9b1f852f62d2842628c107ad52332f067c0c03486844efb4bf89c1bd15b0e" exitCode=0 Dec 06 09:40:43 crc kubenswrapper[4672]: I1206 09:40:43.847001 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"71a9b1f852f62d2842628c107ad52332f067c0c03486844efb4bf89c1bd15b0e"} Dec 06 09:40:43 crc kubenswrapper[4672]: I1206 09:40:43.849078 4672 scope.go:117] "RemoveContainer" containerID="5dbfab7d581ccc84f9182b337ec607ebbed05c2666107cd40acb0c82fff4b999" Dec 06 09:40:45 crc kubenswrapper[4672]: I1206 09:40:45.869787 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb"} Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.477316 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w4jf6"] Dec 06 09:40:55 crc kubenswrapper[4672]: E1206 09:40:55.479137 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067d00eb-9c67-44d0-a734-70c09ab491a2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.479229 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="067d00eb-9c67-44d0-a734-70c09ab491a2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.479535 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="067d00eb-9c67-44d0-a734-70c09ab491a2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.481140 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.490234 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4jf6"] Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.664328 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b6lp\" (UniqueName: \"kubernetes.io/projected/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-kube-api-access-8b6lp\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.664414 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-utilities\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.664450 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-catalog-content\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.766743 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b6lp\" (UniqueName: \"kubernetes.io/projected/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-kube-api-access-8b6lp\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.766817 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-utilities\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.766860 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-catalog-content\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.767330 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-utilities\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.767389 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-catalog-content\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.791018 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8b6lp\" (UniqueName: \"kubernetes.io/projected/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-kube-api-access-8b6lp\") pod \"redhat-marketplace-w4jf6\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") " pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:55 crc kubenswrapper[4672]: I1206 09:40:55.807775 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:40:56 crc kubenswrapper[4672]: I1206 09:40:56.288173 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4jf6"] Dec 06 09:40:57 crc kubenswrapper[4672]: I1206 09:40:57.201208 4672 generic.go:334] "Generic (PLEG): container finished" podID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerID="464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5" exitCode=0 Dec 06 09:40:57 crc kubenswrapper[4672]: I1206 09:40:57.201279 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerDied","Data":"464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5"} Dec 06 09:40:57 crc kubenswrapper[4672]: I1206 09:40:57.201533 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerStarted","Data":"448e0907da523be0feba7a7b1adc4378ec3fe22038c7d9b0371ef960d5f163f4"} Dec 06 09:40:57 crc kubenswrapper[4672]: I1206 09:40:57.204833 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:40:58 crc kubenswrapper[4672]: I1206 09:40:58.212568 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerStarted","Data":"b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997"} Dec 06 09:40:59 crc kubenswrapper[4672]: I1206 09:40:59.221391 4672 generic.go:334] "Generic (PLEG): container finished" podID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerID="b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997" exitCode=0 Dec 06 09:40:59 crc kubenswrapper[4672]: I1206 09:40:59.221446 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerDied","Data":"b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997"} Dec 06 09:41:00 crc kubenswrapper[4672]: I1206 09:41:00.231517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerStarted","Data":"0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b"} Dec 06 09:41:05 crc kubenswrapper[4672]: I1206 09:41:05.808028 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:41:05 crc kubenswrapper[4672]: I1206 09:41:05.808850 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:41:05 crc kubenswrapper[4672]: I1206 09:41:05.858645 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w4jf6" Dec 06 09:41:05 crc kubenswrapper[4672]: I1206 
09:41:05.881976 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w4jf6" podStartSLOduration=8.466536355 podStartE2EDuration="10.881955453s" podCreationTimestamp="2025-12-06 09:40:55 +0000 UTC" firstStartedPulling="2025-12-06 09:40:57.204508188 +0000 UTC m=+2074.948768475" lastFinishedPulling="2025-12-06 09:40:59.619927266 +0000 UTC m=+2077.364187573" observedRunningTime="2025-12-06 09:41:00.254311191 +0000 UTC m=+2077.998571478" watchObservedRunningTime="2025-12-06 09:41:05.881955453 +0000 UTC m=+2083.626215780"
Dec 06 09:41:06 crc kubenswrapper[4672]: I1206 09:41:06.328404 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w4jf6"
Dec 06 09:41:06 crc kubenswrapper[4672]: I1206 09:41:06.374966 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4jf6"]
Dec 06 09:41:08 crc kubenswrapper[4672]: I1206 09:41:08.306391 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w4jf6" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="registry-server" containerID="cri-o://0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b" gracePeriod=2
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.258638 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4jf6"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.318355 4672 generic.go:334] "Generic (PLEG): container finished" podID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerID="0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b" exitCode=0
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.318404 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerDied","Data":"0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b"}
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.318416 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4jf6"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.318437 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4jf6" event={"ID":"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493","Type":"ContainerDied","Data":"448e0907da523be0feba7a7b1adc4378ec3fe22038c7d9b0371ef960d5f163f4"}
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.318458 4672 scope.go:117] "RemoveContainer" containerID="0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.339547 4672 scope.go:117] "RemoveContainer" containerID="b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.358027 4672 scope.go:117] "RemoveContainer" containerID="464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.407048 4672 scope.go:117] "RemoveContainer" containerID="0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b"
Dec 06 09:41:09 crc kubenswrapper[4672]: E1206 09:41:09.407724 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b\": container with ID starting with 0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b not found: ID does not exist" containerID="0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.407754 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b"} err="failed to get container status \"0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b\": rpc error: code = NotFound desc = could not find container \"0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b\": container with ID starting with 0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b not found: ID does not exist"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.407789 4672 scope.go:117] "RemoveContainer" containerID="b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997"
Dec 06 09:41:09 crc kubenswrapper[4672]: E1206 09:41:09.408247 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997\": container with ID starting with b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997 not found: ID does not exist" containerID="b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.408276 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997"} err="failed to get container status \"b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997\": rpc error: code = NotFound desc = could not find container \"b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997\": container with ID starting with b0a637653adc8aeca0739d626017e9471d47479710bdcb902d5d682866e21997 not found: ID does not exist"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.408296 4672 scope.go:117] "RemoveContainer" containerID="464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5"
Dec 06 09:41:09 crc kubenswrapper[4672]: E1206 09:41:09.408585 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5\": container with ID starting with 464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5 not found: ID does not exist" containerID="464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.408624 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5"} err="failed to get container status \"464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5\": rpc error: code = NotFound desc = could not find container \"464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5\": container with ID starting with 464598eccdd9509cf228224fbe1cb08790fab6ce3808bc0e31e13a08d6fca6b5 not found: ID does not exist"
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.439630 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-utilities\") pod \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") "
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.439780 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b6lp\" (UniqueName: \"kubernetes.io/projected/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-kube-api-access-8b6lp\") pod \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") "
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.439832 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-catalog-content\") pod \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\" (UID: \"7726e37b-e38f-4ebb-b6ae-55fa3a7a7493\") "
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.441260 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-utilities" (OuterVolumeSpecName: "utilities") pod "7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" (UID: "7726e37b-e38f-4ebb-b6ae-55fa3a7a7493"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.447982 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-kube-api-access-8b6lp" (OuterVolumeSpecName: "kube-api-access-8b6lp") pod "7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" (UID: "7726e37b-e38f-4ebb-b6ae-55fa3a7a7493"). InnerVolumeSpecName "kube-api-access-8b6lp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.463587 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" (UID: "7726e37b-e38f-4ebb-b6ae-55fa3a7a7493"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
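The three "ContainerStatus from runtime service failed ... NotFound" errors above are a benign race: the kubelet re-issues RemoveContainer for IDs that CRI-O already pruned when the pod was deleted, so the follow-up status query finds nothing. A minimal way to confirm by hand that an ID is really gone (a sketch, assuming crictl is installed on the node and pointed at the CRI-O socket, as is typical on CRC):

  # Spot-check one of the IDs from the log; no output means it was already pruned.
  sudo crictl ps -a | grep 0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b || echo "container already pruned"
  sudo crictl inspect 0842e4aa2b625da4132b90305702c72d2b87ba937e155ad6d65b18c60024a46b 2>/dev/null || echo "no status (matches the NotFound above)"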
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.542017 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.542193 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b6lp\" (UniqueName: \"kubernetes.io/projected/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-kube-api-access-8b6lp\") on node \"crc\" DevicePath \"\""
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.542283 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.667510 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4jf6"]
Dec 06 09:41:09 crc kubenswrapper[4672]: I1206 09:41:09.676406 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4jf6"]
Dec 06 09:41:10 crc kubenswrapper[4672]: I1206 09:41:10.570479 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" path="/var/lib/kubelet/pods/7726e37b-e38f-4ebb-b6ae-55fa3a7a7493/volumes"
Dec 06 09:42:29 crc kubenswrapper[4672]: E1206 09:42:29.651405 4672 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:40456->38.102.83.30:37519: write tcp 38.102.83.30:40456->38.102.83.30:37519: write: broken pipe
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.503782 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.519184 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.547432 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.558850 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m8k8d"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.566436 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.578730 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.586916 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5rtn"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.596306 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9gcwx"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.603956 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2n65f"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.609999 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mdj7j"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.615536 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.621221 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.627063 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.632586 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pc4sn"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.638129 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.644093 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5rtn"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.650770 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vh9xp"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.659504 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zj959"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.667632 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kwnww"]
Dec 06 09:42:53 crc kubenswrapper[4672]: I1206 09:42:53.675733 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jsfh6"]
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.566379 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067d00eb-9c67-44d0-a734-70c09ab491a2" path="/var/lib/kubelet/pods/067d00eb-9c67-44d0-a734-70c09ab491a2/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.567368 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e" path="/var/lib/kubelet/pods/202ca8e6-0bd4-4b3f-b90e-6feb22bdea2e/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.567972 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33362cf5-2204-478e-b155-8277d00131a6" path="/var/lib/kubelet/pods/33362cf5-2204-478e-b155-8277d00131a6/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.568501 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a25749d-e4fb-4a67-9a0d-d95b4aa8609a" path="/var/lib/kubelet/pods/5a25749d-e4fb-4a67-9a0d-d95b4aa8609a/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.569560 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750227a2-c497-4579-b34b-3ebb2a8d502b" path="/var/lib/kubelet/pods/750227a2-c497-4579-b34b-3ebb2a8d502b/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.570298 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e522ba-e183-41c7-a1f3-b9085bdac873" path="/var/lib/kubelet/pods/a5e522ba-e183-41c7-a1f3-b9085bdac873/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.570852 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adc9f5aa-a718-43cf-8889-1e71aaa151c4" path="/var/lib/kubelet/pods/adc9f5aa-a718-43cf-8889-1e71aaa151c4/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.571852 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09fbca5-c748-48fc-8008-d2b3d644df4a" path="/var/lib/kubelet/pods/c09fbca5-c748-48fc-8008-d2b3d644df4a/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.572371 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c" path="/var/lib/kubelet/pods/c1f97f31-1aa4-4bb6-8c84-4cc45fe5238c/volumes"
Dec 06 09:42:54 crc kubenswrapper[4672]: I1206 09:42:54.572927 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cb6b82-34a9-4543-88ac-89d2a1a52a0f" path="/var/lib/kubelet/pods/d6cb6b82-34a9-4543-88ac-89d2a1a52a0f/volumes"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.888101 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"]
Dec 06 09:43:06 crc kubenswrapper[4672]: E1206 09:43:06.889186 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="extract-content"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.889204 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="extract-content"
Dec 06 09:43:06 crc kubenswrapper[4672]: E1206 09:43:06.889236 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="extract-utilities"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.889245 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="extract-utilities"
Dec 06 09:43:06 crc kubenswrapper[4672]: E1206 09:43:06.889269 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="registry-server"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.889277 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="registry-server"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.889467 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7726e37b-e38f-4ebb-b6ae-55fa3a7a7493" containerName="registry-server"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.890233 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
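The RemoveStaleState and "Deleted CPUSet assignment" entries above are the kubelet's cpu_manager and memory_manager dropping leftover per-container bookkeeping for the long-deleted redhat-marketplace-w4jf6 containers before admitting the new pod. That bookkeeping lives in checkpoint files under /var/lib/kubelet/; a quick way to inspect them (a sketch, assuming jq is available on the node):

  # Per-container CPUSet assignments tracked by the kubelet's CPU manager.
  sudo jq . /var/lib/kubelet/cpu_manager_state
  # Equivalent checkpoint kept by the memory manager.
  sudo jq . /var/lib/kubelet/memory_manager_state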
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.892426 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.892829 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.892904 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.893081 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.893193 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 06 09:43:06 crc kubenswrapper[4672]: I1206 09:43:06.910232 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"]
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.042152 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.042237 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2z6\" (UniqueName: \"kubernetes.io/projected/38dd2d36-2778-405f-97b8-d2651746de0c-kube-api-access-mq2z6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.042328 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.042478 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.042673 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.144882 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.144934 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.144979 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.145050 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.145099 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2z6\" (UniqueName: \"kubernetes.io/projected/38dd2d36-2778-405f-97b8-d2651746de0c-kube-api-access-mq2z6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.153683 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.155127 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.157309 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.157390 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.162869 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2z6\" (UniqueName: \"kubernetes.io/projected/38dd2d36-2778-405f-97b8-d2651746de0c-kube-api-access-mq2z6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.249102 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:07 crc kubenswrapper[4672]: I1206 09:43:07.773876 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"]
Dec 06 09:43:08 crc kubenswrapper[4672]: I1206 09:43:08.474746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt" event={"ID":"38dd2d36-2778-405f-97b8-d2651746de0c","Type":"ContainerStarted","Data":"7cc046d419d7e15d40ee983ab4a786813c53eb46dc90e85a7595df2067766ecf"}
Dec 06 09:43:08 crc kubenswrapper[4672]: I1206 09:43:08.475330 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt" event={"ID":"38dd2d36-2778-405f-97b8-d2651746de0c","Type":"ContainerStarted","Data":"e4d5ef97eed3f17244b1271b44a7fc5ec140f91e57ab92b03f318b04c9b16fdc"}
Dec 06 09:43:08 crc kubenswrapper[4672]: I1206 09:43:08.499240 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt" podStartSLOduration=2.086018146 podStartE2EDuration="2.49921955s" podCreationTimestamp="2025-12-06 09:43:06 +0000 UTC" firstStartedPulling="2025-12-06 09:43:07.784623112 +0000 UTC m=+2205.528883399" lastFinishedPulling="2025-12-06 09:43:08.197824516 +0000 UTC m=+2205.942084803" observedRunningTime="2025-12-06 09:43:08.494077311 +0000 UTC m=+2206.238337608" watchObservedRunningTime="2025-12-06 09:43:08.49921955 +0000 UTC m=+2206.243479847"
Dec 06 09:43:12 crc kubenswrapper[4672]: I1206 09:43:12.319901 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:43:12 crc kubenswrapper[4672]: I1206 09:43:12.320550 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:43:20 crc kubenswrapper[4672]: I1206 09:43:20.618070 4672 generic.go:334] "Generic (PLEG): container finished" podID="38dd2d36-2778-405f-97b8-d2651746de0c" containerID="7cc046d419d7e15d40ee983ab4a786813c53eb46dc90e85a7595df2067766ecf" exitCode=0
Dec 06 09:43:20 crc kubenswrapper[4672]: I1206 09:43:20.618147 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt" event={"ID":"38dd2d36-2778-405f-97b8-d2651746de0c","Type":"ContainerDied","Data":"7cc046d419d7e15d40ee983ab4a786813c53eb46dc90e85a7595df2067766ecf"}
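The Liveness probe failure logged at 09:43:12 (and again at 09:43:42 below) is an HTTP GET to 127.0.0.1:8798/health being refused, meaning nothing was listening on the machine-config-daemon's health port at that instant. The kubelet's probe can be mimicked from the node with a one-liner (a sketch; host and port copied from the log):

  # "connection refused" here reproduces the probeResult="failure" entries.
  curl -fsS --max-time 1 http://127.0.0.1:8798/health && echo healthy || echo "probe failed"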
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt" event={"ID":"38dd2d36-2778-405f-97b8-d2651746de0c","Type":"ContainerDied","Data":"7cc046d419d7e15d40ee983ab4a786813c53eb46dc90e85a7595df2067766ecf"}
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.120707 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.231435 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ssh-key\") pod \"38dd2d36-2778-405f-97b8-d2651746de0c\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") "
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.231646 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-inventory\") pod \"38dd2d36-2778-405f-97b8-d2651746de0c\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") "
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.231716 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2z6\" (UniqueName: \"kubernetes.io/projected/38dd2d36-2778-405f-97b8-d2651746de0c-kube-api-access-mq2z6\") pod \"38dd2d36-2778-405f-97b8-d2651746de0c\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") "
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.231758 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ceph\") pod \"38dd2d36-2778-405f-97b8-d2651746de0c\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") "
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.231827 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-repo-setup-combined-ca-bundle\") pod \"38dd2d36-2778-405f-97b8-d2651746de0c\" (UID: \"38dd2d36-2778-405f-97b8-d2651746de0c\") "
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.239067 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38dd2d36-2778-405f-97b8-d2651746de0c-kube-api-access-mq2z6" (OuterVolumeSpecName: "kube-api-access-mq2z6") pod "38dd2d36-2778-405f-97b8-d2651746de0c" (UID: "38dd2d36-2778-405f-97b8-d2651746de0c"). InnerVolumeSpecName "kube-api-access-mq2z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.262699 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ceph" (OuterVolumeSpecName: "ceph") pod "38dd2d36-2778-405f-97b8-d2651746de0c" (UID: "38dd2d36-2778-405f-97b8-d2651746de0c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.262761 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "38dd2d36-2778-405f-97b8-d2651746de0c" (UID: "38dd2d36-2778-405f-97b8-d2651746de0c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.270173 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "38dd2d36-2778-405f-97b8-d2651746de0c" (UID: "38dd2d36-2778-405f-97b8-d2651746de0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.286930 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-inventory" (OuterVolumeSpecName: "inventory") pod "38dd2d36-2778-405f-97b8-d2651746de0c" (UID: "38dd2d36-2778-405f-97b8-d2651746de0c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.334382 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-inventory\") on node \"crc\" DevicePath \"\""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.334420 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2z6\" (UniqueName: \"kubernetes.io/projected/38dd2d36-2778-405f-97b8-d2651746de0c-kube-api-access-mq2z6\") on node \"crc\" DevicePath \"\""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.334438 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ceph\") on node \"crc\" DevicePath \"\""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.334453 4672 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.334464 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38dd2d36-2778-405f-97b8-d2651746de0c-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.638055 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt" event={"ID":"38dd2d36-2778-405f-97b8-d2651746de0c","Type":"ContainerDied","Data":"e4d5ef97eed3f17244b1271b44a7fc5ec140f91e57ab92b03f318b04c9b16fdc"}
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.638093 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d5ef97eed3f17244b1271b44a7fc5ec140f91e57ab92b03f318b04c9b16fdc"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.638365 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt"
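Once the five "Volume detached" records above land, the repo-setup pod's volume tree under /var/lib/kubelet/pods/<uid>/volumes is empty, which is the same precondition behind the "Cleaned up orphaned pod volumes dir" messages at 09:42:54. This can be checked directly on the node (a sketch, using the pod UID from the log):

  # Should list nothing, or fail outright once the orphan cleanup has run.
  sudo ls -R /var/lib/kubelet/pods/38dd2d36-2778-405f-97b8-d2651746de0c/volumes 2>/dev/null || echo "volumes dir already removed"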
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.733428 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"]
Dec 06 09:43:22 crc kubenswrapper[4672]: E1206 09:43:22.733843 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38dd2d36-2778-405f-97b8-d2651746de0c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.733861 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="38dd2d36-2778-405f-97b8-d2651746de0c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.734019 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="38dd2d36-2778-405f-97b8-d2651746de0c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.734566 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.740168 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.741855 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.742354 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.742535 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.742762 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.746744 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"]
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.877673 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.877794 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.877928 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.877970 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj2kl\" (UniqueName: \"kubernetes.io/projected/87cce220-e210-44d8-ac72-946b6e9bb4c4-kube-api-access-kj2kl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.878037 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.979517 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.979634 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.979731 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.979760 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj2kl\" (UniqueName: \"kubernetes.io/projected/87cce220-e210-44d8-ac72-946b6e9bb4c4-kube-api-access-kj2kl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.979832 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.983973 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.984294 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.984392 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.997539 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:22 crc kubenswrapper[4672]: I1206 09:43:22.998041 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj2kl\" (UniqueName: \"kubernetes.io/projected/87cce220-e210-44d8-ac72-946b6e9bb4c4-kube-api-access-kj2kl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
Dec 06 09:43:23 crc kubenswrapper[4672]: I1206 09:43:23.054215 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"
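Each of these EDPM job pods requests the same five volumes: ssh-key, inventory, ceph, a service-specific combined CA bundle, and a projected kube-api-access token, all backed by Secrets in the openstack namespace. The wiring can be confirmed from the API side (a sketch, assuming oc access to the cluster while the pod still exists):

  # Volume names in the pod spec should match the VerifyControllerAttachedVolume
  # and MountVolume entries above.
  oc -n openstack get pod bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9 -o jsonpath='{range .spec.volumes[*]}{.name}{"\n"}{end}'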
Dec 06 09:43:23 crc kubenswrapper[4672]: I1206 09:43:23.618907 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9"]
Dec 06 09:43:23 crc kubenswrapper[4672]: I1206 09:43:23.647782 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" event={"ID":"87cce220-e210-44d8-ac72-946b6e9bb4c4","Type":"ContainerStarted","Data":"473294c5f99fe7f6ba38cc1bf740f7013a59f8df83d7695ba09fa18e50df1779"}
Dec 06 09:43:25 crc kubenswrapper[4672]: I1206 09:43:25.666143 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" event={"ID":"87cce220-e210-44d8-ac72-946b6e9bb4c4","Type":"ContainerStarted","Data":"4e7ff482a63fef5ff3b0b48cbf361b361d2d7ad2d9b5f69d175097af8870f974"}
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.029548 4672 scope.go:117] "RemoveContainer" containerID="ea5b27f0547da00d36e63961b537fcf4e7fecbc88c14a3b5185fdaf4d13a12ac"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.093833 4672 scope.go:117] "RemoveContainer" containerID="2ee253348ca477296a8c93bd7223507e4bf865837529803da55c069a07c171f6"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.170008 4672 scope.go:117] "RemoveContainer" containerID="157c2c0e2445db78091d7c3c15af2dfba630a570d2cfb1ae3ea139febbaa451e"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.209177 4672 scope.go:117] "RemoveContainer" containerID="68148a62182292f0e0af5b426c0a90a6c2c4bf2f5d666e80043910147604e434"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.238313 4672 scope.go:117] "RemoveContainer" containerID="a314e652a2d077d56ab8a91a1cb6e22b64bda121bde14cb706a48eb6a7e88fe8"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.303399 4672 scope.go:117] "RemoveContainer" containerID="381734fb76e133db0b2856795d81d52f3b7ca56692768146bbbbe5241b5d0b1b"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.359886 4672 scope.go:117] "RemoveContainer" containerID="c8cc845e4d957fc76f47277fe9c4869e9a8d6dc9d4741f4f3b222aa757d9b3fb"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.388753 4672 scope.go:117] "RemoveContainer" containerID="82923c4e0c03dad55123bf8cb1a799d163c4ca6f1a1acb88f3023abad9d3c8dc"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.423103 4672 scope.go:117] "RemoveContainer" containerID="edb58ebf470502fa3a0a3ed0904044ded966a8d1ec88988985df9f3aeb872457"
Dec 06 09:43:27 crc kubenswrapper[4672]: I1206 09:43:27.453305 4672 scope.go:117] "RemoveContainer" containerID="b480dbc4b124c07682885b7255ee6783a8a2f6b4f4e06f57ea4ce59e0d31fb98"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.405373 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" podStartSLOduration=16.760633735 podStartE2EDuration="17.405347431s" podCreationTimestamp="2025-12-06 09:43:22 +0000 UTC" firstStartedPulling="2025-12-06 09:43:23.620451829 +0000 UTC m=+2221.364712126" lastFinishedPulling="2025-12-06 09:43:24.265165535 +0000 UTC m=+2222.009425822" observedRunningTime="2025-12-06 09:43:25.688286818 +0000 UTC m=+2223.432547105" watchObservedRunningTime="2025-12-06 09:43:39.405347431 +0000 UTC m=+2237.149607758"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.417340 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l4xtg"]
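The burst of RemoveContainer calls at 09:43:27 looks like the kubelet's periodic container garbage collection sweeping exited containers left behind by the EDPM job pods deleted at 09:42:53. The CRI-side view of the same sweep (a sketch; crictl assumed configured for CRI-O, jq assumed available):

  # Exited containers awaiting collection; the list shrinks as the
  # RemoveContainer entries above are processed.
  sudo crictl ps -a --state Exited -o json | jq -r '.containers[].id'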
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.420806 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.439348 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4xtg"]
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.504352 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-catalog-content\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.504480 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-utilities\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.504512 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6tlk\" (UniqueName: \"kubernetes.io/projected/c12faea2-1fe3-4146-adeb-02e713260bde-kube-api-access-s6tlk\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.607484 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-catalog-content\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.607557 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-utilities\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.607575 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6tlk\" (UniqueName: \"kubernetes.io/projected/c12faea2-1fe3-4146-adeb-02e713260bde-kube-api-access-s6tlk\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.608536 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-utilities\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.608562 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-catalog-content\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.633378 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6tlk\" (UniqueName: \"kubernetes.io/projected/c12faea2-1fe3-4146-adeb-02e713260bde-kube-api-access-s6tlk\") pod \"redhat-operators-l4xtg\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") " pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:39 crc kubenswrapper[4672]: I1206 09:43:39.759669 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:40 crc kubenswrapper[4672]: I1206 09:43:40.240781 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4xtg"]
Dec 06 09:43:40 crc kubenswrapper[4672]: I1206 09:43:40.809821 4672 generic.go:334] "Generic (PLEG): container finished" podID="c12faea2-1fe3-4146-adeb-02e713260bde" containerID="ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b" exitCode=0
Dec 06 09:43:40 crc kubenswrapper[4672]: I1206 09:43:40.809974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerDied","Data":"ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b"}
Dec 06 09:43:40 crc kubenswrapper[4672]: I1206 09:43:40.810172 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerStarted","Data":"7b3db8915c2112379e447c387374cb1412ca74842a90b7f87271b8c07ec5f90e"}
Dec 06 09:43:41 crc kubenswrapper[4672]: I1206 09:43:41.819060 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerStarted","Data":"b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346"}
Dec 06 09:43:42 crc kubenswrapper[4672]: I1206 09:43:42.319708 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:43:42 crc kubenswrapper[4672]: I1206 09:43:42.319761 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:43:50 crc kubenswrapper[4672]: I1206 09:43:50.931350 4672 generic.go:334] "Generic (PLEG): container finished" podID="c12faea2-1fe3-4146-adeb-02e713260bde" containerID="b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346" exitCode=0
Dec 06 09:43:50 crc kubenswrapper[4672]: I1206 09:43:50.931412 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerDied","Data":"b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346"}
Dec 06 09:43:53 crc kubenswrapper[4672]: I1206 09:43:53.954114 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerStarted","Data":"960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d"}
Dec 06 09:43:53 crc kubenswrapper[4672]: I1206 09:43:53.979803 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l4xtg" podStartSLOduration=2.5888138830000003 podStartE2EDuration="14.979784447s" podCreationTimestamp="2025-12-06 09:43:39 +0000 UTC" firstStartedPulling="2025-12-06 09:43:40.811903148 +0000 UTC m=+2238.556163435" lastFinishedPulling="2025-12-06 09:43:53.202873692 +0000 UTC m=+2250.947133999" observedRunningTime="2025-12-06 09:43:53.977283251 +0000 UTC m=+2251.721543558" watchObservedRunningTime="2025-12-06 09:43:53.979784447 +0000 UTC m=+2251.724044734"
Dec 06 09:43:59 crc kubenswrapper[4672]: I1206 09:43:59.761124 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:59 crc kubenswrapper[4672]: I1206 09:43:59.761904 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:43:59 crc kubenswrapper[4672]: I1206 09:43:59.818626 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:44:00 crc kubenswrapper[4672]: I1206 09:44:00.066409 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:44:00 crc kubenswrapper[4672]: I1206 09:44:00.129069 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4xtg"]
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.036669 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l4xtg" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="registry-server" containerID="cri-o://960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d" gracePeriod=2
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.525774 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4xtg"
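Three pods in this window report their startup latency through pod_startup_latency_tracker.go; the gap between podStartSLOduration and podStartE2EDuration is image-pull time, which the SLO figure excludes (here, 2.59 s vs 14.98 s for redhat-operators-l4xtg, whose catalog image took roughly 12.4 s to pull). Those records can be pulled out of the journal for a quick overview (a sketch; the kubelet's systemd unit name is assumed to be kubelet, per the service start-up at the top of this log):

  # Extract pod name plus SLO and end-to-end startup durations.
  sudo journalctl -u kubelet --since "09:40" | grep -o 'pod="[^"]*" podStartSLOduration=[0-9.]* podStartE2EDuration="[^"]*"'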
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.632243 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-catalog-content\") pod \"c12faea2-1fe3-4146-adeb-02e713260bde\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") "
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.632310 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-utilities\") pod \"c12faea2-1fe3-4146-adeb-02e713260bde\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") "
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.632359 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6tlk\" (UniqueName: \"kubernetes.io/projected/c12faea2-1fe3-4146-adeb-02e713260bde-kube-api-access-s6tlk\") pod \"c12faea2-1fe3-4146-adeb-02e713260bde\" (UID: \"c12faea2-1fe3-4146-adeb-02e713260bde\") "
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.633130 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-utilities" (OuterVolumeSpecName: "utilities") pod "c12faea2-1fe3-4146-adeb-02e713260bde" (UID: "c12faea2-1fe3-4146-adeb-02e713260bde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.637687 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12faea2-1fe3-4146-adeb-02e713260bde-kube-api-access-s6tlk" (OuterVolumeSpecName: "kube-api-access-s6tlk") pod "c12faea2-1fe3-4146-adeb-02e713260bde" (UID: "c12faea2-1fe3-4146-adeb-02e713260bde"). InnerVolumeSpecName "kube-api-access-s6tlk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.734669 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.734711 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6tlk\" (UniqueName: \"kubernetes.io/projected/c12faea2-1fe3-4146-adeb-02e713260bde-kube-api-access-s6tlk\") on node \"crc\" DevicePath \"\""
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.739538 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c12faea2-1fe3-4146-adeb-02e713260bde" (UID: "c12faea2-1fe3-4146-adeb-02e713260bde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:44:02 crc kubenswrapper[4672]: I1206 09:44:02.836108 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12faea2-1fe3-4146-adeb-02e713260bde-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.045920 4672 generic.go:334] "Generic (PLEG): container finished" podID="c12faea2-1fe3-4146-adeb-02e713260bde" containerID="960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d" exitCode=0
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.045958 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerDied","Data":"960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d"}
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.045983 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4xtg" event={"ID":"c12faea2-1fe3-4146-adeb-02e713260bde","Type":"ContainerDied","Data":"7b3db8915c2112379e447c387374cb1412ca74842a90b7f87271b8c07ec5f90e"}
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.046009 4672 scope.go:117] "RemoveContainer" containerID="960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.046111 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4xtg"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.072140 4672 scope.go:117] "RemoveContainer" containerID="b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.086428 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4xtg"]
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.092258 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l4xtg"]
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.103933 4672 scope.go:117] "RemoveContainer" containerID="ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.138152 4672 scope.go:117] "RemoveContainer" containerID="960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d"
Dec 06 09:44:03 crc kubenswrapper[4672]: E1206 09:44:03.138536 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d\": container with ID starting with 960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d not found: ID does not exist" containerID="960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.138578 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d"} err="failed to get container status \"960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d\": rpc error: code = NotFound desc = could not find container \"960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d\": container with ID starting with 960c7f88e77d9279011dbc98051d7e39eec0f8182ce92788948ff0b1b4687c5d not found: ID does not exist"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.138656 4672 scope.go:117] "RemoveContainer" containerID="b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346"
Dec 06 09:44:03 crc kubenswrapper[4672]: E1206 09:44:03.140295 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346\": container with ID starting with b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346 not found: ID does not exist" containerID="b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.140328 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346"} err="failed to get container status \"b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346\": rpc error: code = NotFound desc = could not find container \"b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346\": container with ID starting with b30186ff1b5722869744ab385a460f66337094a880195aa96b1139e75b036346 not found: ID does not exist"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.140347 4672 scope.go:117] "RemoveContainer" containerID="ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b"
Dec 06 09:44:03 crc kubenswrapper[4672]: E1206 09:44:03.140700 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b\": container with ID starting with ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b not found: ID does not exist" containerID="ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b"
Dec 06 09:44:03 crc kubenswrapper[4672]: I1206 09:44:03.140752 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b"} err="failed to get container status \"ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b\": rpc error: code = NotFound desc = could not find container \"ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b\": container with ID starting with ebec905a34c7191d0384993363fc8aa9312ada6149832772045fba6ef4cb410b not found: ID does not exist"
Dec 06 09:44:04 crc kubenswrapper[4672]: I1206 09:44:04.567285 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" path="/var/lib/kubelet/pods/c12faea2-1fe3-4146-adeb-02e713260bde/volumes"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.042378 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s2z9s"]
Dec 06 09:44:09 crc kubenswrapper[4672]: E1206 09:44:09.043630 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="extract-utilities"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.043653 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="extract-utilities"
Dec 06 09:44:09 crc kubenswrapper[4672]: E1206 09:44:09.043677 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="registry-server"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.043691 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="registry-server"
Dec 06 09:44:09 crc kubenswrapper[4672]: E1206 09:44:09.043741 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="extract-content"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.043756 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="extract-content"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.044112 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12faea2-1fe3-4146-adeb-02e713260bde" containerName="registry-server"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.047651 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.051730 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2z9s"]
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.148697 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-utilities\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.148779 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78ggg\" (UniqueName: \"kubernetes.io/projected/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-kube-api-access-78ggg\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.148819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-catalog-content\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.250501 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-catalog-content\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.250698 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-utilities\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.250737 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78ggg\" (UniqueName: \"kubernetes.io/projected/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-kube-api-access-78ggg\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s"
Dec 06 09:44:09 crc
kubenswrapper[4672]: I1206 09:44:09.251140 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-catalog-content\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.251166 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-utilities\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.270041 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78ggg\" (UniqueName: \"kubernetes.io/projected/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-kube-api-access-78ggg\") pod \"community-operators-s2z9s\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.379821 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:09 crc kubenswrapper[4672]: I1206 09:44:09.990879 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2z9s"] Dec 06 09:44:10 crc kubenswrapper[4672]: I1206 09:44:10.134209 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2z9s" event={"ID":"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0","Type":"ContainerStarted","Data":"b29f7efef0c217810391184d3194e89f772b72f44e5098d8a5a481df721daa66"} Dec 06 09:44:11 crc kubenswrapper[4672]: I1206 09:44:11.144032 4672 generic.go:334] "Generic (PLEG): container finished" podID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerID="6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9" exitCode=0 Dec 06 09:44:11 crc kubenswrapper[4672]: I1206 09:44:11.144116 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2z9s" event={"ID":"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0","Type":"ContainerDied","Data":"6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9"} Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.235800 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c6qk7"] Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.238779 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.251999 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6qk7"] Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.313714 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-utilities\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.313764 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngc4t\" (UniqueName: \"kubernetes.io/projected/da19355e-8626-43f9-a9f7-2e09b176dd8b-kube-api-access-ngc4t\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.313921 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-catalog-content\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.319782 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.319832 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.319869 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.320389 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.320436 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" gracePeriod=600 Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.415503 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-catalog-content\") pod 
\"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.415657 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-utilities\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.415700 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngc4t\" (UniqueName: \"kubernetes.io/projected/da19355e-8626-43f9-a9f7-2e09b176dd8b-kube-api-access-ngc4t\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.416551 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-catalog-content\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.417130 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-utilities\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.446551 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngc4t\" (UniqueName: \"kubernetes.io/projected/da19355e-8626-43f9-a9f7-2e09b176dd8b-kube-api-access-ngc4t\") pod \"certified-operators-c6qk7\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:12 crc kubenswrapper[4672]: E1206 09:44:12.460898 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:44:12 crc kubenswrapper[4672]: I1206 09:44:12.562710 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.072736 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6qk7"] Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.164465 4672 generic.go:334] "Generic (PLEG): container finished" podID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerID="1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343" exitCode=0 Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.164520 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2z9s" event={"ID":"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0","Type":"ContainerDied","Data":"1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343"} Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.166952 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" exitCode=0 Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.167022 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb"} Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.167074 4672 scope.go:117] "RemoveContainer" containerID="71a9b1f852f62d2842628c107ad52332f067c0c03486844efb4bf89c1bd15b0e" Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.167485 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:44:13 crc kubenswrapper[4672]: E1206 09:44:13.167793 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:44:13 crc kubenswrapper[4672]: I1206 09:44:13.168815 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerStarted","Data":"9d813aaa8e8b8ffcffa37dc43fcb7bc196b94e34a054641fb27f1b31dc746c4d"} Dec 06 09:44:14 crc kubenswrapper[4672]: I1206 09:44:14.187924 4672 generic.go:334] "Generic (PLEG): container finished" podID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerID="9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf" exitCode=0 Dec 06 09:44:14 crc kubenswrapper[4672]: I1206 09:44:14.188023 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerDied","Data":"9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf"} Dec 06 09:44:14 crc kubenswrapper[4672]: I1206 09:44:14.193491 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2z9s" event={"ID":"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0","Type":"ContainerStarted","Data":"bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00"} Dec 06 09:44:14 crc kubenswrapper[4672]: I1206 
09:44:14.249920 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s2z9s" podStartSLOduration=2.851667056 podStartE2EDuration="5.249903291s" podCreationTimestamp="2025-12-06 09:44:09 +0000 UTC" firstStartedPulling="2025-12-06 09:44:11.146890402 +0000 UTC m=+2268.891150699" lastFinishedPulling="2025-12-06 09:44:13.545126647 +0000 UTC m=+2271.289386934" observedRunningTime="2025-12-06 09:44:14.247732022 +0000 UTC m=+2271.991992309" watchObservedRunningTime="2025-12-06 09:44:14.249903291 +0000 UTC m=+2271.994163578" Dec 06 09:44:15 crc kubenswrapper[4672]: I1206 09:44:15.203053 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerStarted","Data":"a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c"} Dec 06 09:44:18 crc kubenswrapper[4672]: I1206 09:44:18.236119 4672 generic.go:334] "Generic (PLEG): container finished" podID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerID="a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c" exitCode=0 Dec 06 09:44:18 crc kubenswrapper[4672]: I1206 09:44:18.236475 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerDied","Data":"a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c"} Dec 06 09:44:19 crc kubenswrapper[4672]: I1206 09:44:19.267447 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerStarted","Data":"57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f"} Dec 06 09:44:19 crc kubenswrapper[4672]: I1206 09:44:19.302729 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c6qk7" podStartSLOduration=2.824454779 podStartE2EDuration="7.30270977s" podCreationTimestamp="2025-12-06 09:44:12 +0000 UTC" firstStartedPulling="2025-12-06 09:44:14.189921056 +0000 UTC m=+2271.934181343" lastFinishedPulling="2025-12-06 09:44:18.668176027 +0000 UTC m=+2276.412436334" observedRunningTime="2025-12-06 09:44:19.297928711 +0000 UTC m=+2277.042188998" watchObservedRunningTime="2025-12-06 09:44:19.30270977 +0000 UTC m=+2277.046970067" Dec 06 09:44:19 crc kubenswrapper[4672]: I1206 09:44:19.380825 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:19 crc kubenswrapper[4672]: I1206 09:44:19.380867 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:19 crc kubenswrapper[4672]: I1206 09:44:19.425719 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:20 crc kubenswrapper[4672]: I1206 09:44:20.356153 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:21 crc kubenswrapper[4672]: I1206 09:44:21.618748 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2z9s"] Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.298806 4672 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-s2z9s" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="registry-server" containerID="cri-o://bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00" gracePeriod=2 Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.575692 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.576033 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.642804 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.754773 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.837719 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-utilities\") pod \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.837940 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-catalog-content\") pod \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.838184 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78ggg\" (UniqueName: \"kubernetes.io/projected/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-kube-api-access-78ggg\") pod \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\" (UID: \"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0\") " Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.838525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-utilities" (OuterVolumeSpecName: "utilities") pod "0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" (UID: "0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.839008 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.848812 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-kube-api-access-78ggg" (OuterVolumeSpecName: "kube-api-access-78ggg") pod "0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" (UID: "0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0"). InnerVolumeSpecName "kube-api-access-78ggg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.892782 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" (UID: "0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.941194 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78ggg\" (UniqueName: \"kubernetes.io/projected/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-kube-api-access-78ggg\") on node \"crc\" DevicePath \"\"" Dec 06 09:44:22 crc kubenswrapper[4672]: I1206 09:44:22.941224 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.309464 4672 generic.go:334] "Generic (PLEG): container finished" podID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerID="bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00" exitCode=0 Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.309559 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2z9s" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.309577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2z9s" event={"ID":"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0","Type":"ContainerDied","Data":"bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00"} Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.309642 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2z9s" event={"ID":"0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0","Type":"ContainerDied","Data":"b29f7efef0c217810391184d3194e89f772b72f44e5098d8a5a481df721daa66"} Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.309665 4672 scope.go:117] "RemoveContainer" containerID="bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.337198 4672 scope.go:117] "RemoveContainer" containerID="1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.366095 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2z9s"] Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.385259 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s2z9s"] Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.400803 4672 scope.go:117] "RemoveContainer" containerID="6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.417438 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.441527 4672 scope.go:117] "RemoveContainer" containerID="bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00" Dec 06 09:44:23 crc kubenswrapper[4672]: E1206 09:44:23.442247 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00\": container with ID starting with bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00 not found: ID does not exist" containerID="bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.442362 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00"} err="failed to get container status \"bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00\": rpc error: code = NotFound desc = could not find container \"bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00\": container with ID starting with bfd35bdbb2ea000528dcd225a02d6b0ecdac3cafa6f8ff953ca349f19f80bf00 not found: ID does not exist" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.442461 4672 scope.go:117] "RemoveContainer" containerID="1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343" Dec 06 09:44:23 crc kubenswrapper[4672]: E1206 09:44:23.442875 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343\": container with ID starting with 1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343 not found: ID does not exist" containerID="1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.442925 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343"} err="failed to get container status \"1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343\": rpc error: code = NotFound desc = could not find container \"1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343\": container with ID starting with 1e07ca4a0513b513ae8a775afee3cbde45029e9ba240310e5c1470073483f343 not found: ID does not exist" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.442958 4672 scope.go:117] "RemoveContainer" containerID="6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9" Dec 06 09:44:23 crc kubenswrapper[4672]: E1206 09:44:23.443920 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9\": container with ID starting with 6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9 not found: ID does not exist" containerID="6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9" Dec 06 09:44:23 crc kubenswrapper[4672]: I1206 09:44:23.444016 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9"} err="failed to get container status \"6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9\": rpc error: code = NotFound desc = could not find container \"6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9\": container with ID starting with 6af6b3d6dd0193116d2d48ecdf54dac434bb985dd7e5f7febafc0b17399bb7e9 not found: ID does not exist" Dec 06 09:44:24 crc kubenswrapper[4672]: I1206 09:44:24.583072 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" path="/var/lib/kubelet/pods/0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0/volumes" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.014452 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6qk7"] Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.328370 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c6qk7" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="registry-server" containerID="cri-o://57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f" gracePeriod=2 Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.748783 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.799363 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-utilities\") pod \"da19355e-8626-43f9-a9f7-2e09b176dd8b\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.799573 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-catalog-content\") pod \"da19355e-8626-43f9-a9f7-2e09b176dd8b\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.799734 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngc4t\" (UniqueName: \"kubernetes.io/projected/da19355e-8626-43f9-a9f7-2e09b176dd8b-kube-api-access-ngc4t\") pod \"da19355e-8626-43f9-a9f7-2e09b176dd8b\" (UID: \"da19355e-8626-43f9-a9f7-2e09b176dd8b\") " Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.811695 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-utilities" (OuterVolumeSpecName: "utilities") pod "da19355e-8626-43f9-a9f7-2e09b176dd8b" (UID: "da19355e-8626-43f9-a9f7-2e09b176dd8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.812968 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da19355e-8626-43f9-a9f7-2e09b176dd8b-kube-api-access-ngc4t" (OuterVolumeSpecName: "kube-api-access-ngc4t") pod "da19355e-8626-43f9-a9f7-2e09b176dd8b" (UID: "da19355e-8626-43f9-a9f7-2e09b176dd8b"). InnerVolumeSpecName "kube-api-access-ngc4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.852853 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da19355e-8626-43f9-a9f7-2e09b176dd8b" (UID: "da19355e-8626-43f9-a9f7-2e09b176dd8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.901896 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngc4t\" (UniqueName: \"kubernetes.io/projected/da19355e-8626-43f9-a9f7-2e09b176dd8b-kube-api-access-ngc4t\") on node \"crc\" DevicePath \"\"" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.902228 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:44:25 crc kubenswrapper[4672]: I1206 09:44:25.902311 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da19355e-8626-43f9-a9f7-2e09b176dd8b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.336965 4672 generic.go:334] "Generic (PLEG): container finished" podID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerID="57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f" exitCode=0 Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.337005 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerDied","Data":"57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f"} Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.337029 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qk7" event={"ID":"da19355e-8626-43f9-a9f7-2e09b176dd8b","Type":"ContainerDied","Data":"9d813aaa8e8b8ffcffa37dc43fcb7bc196b94e34a054641fb27f1b31dc746c4d"} Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.337049 4672 scope.go:117] "RemoveContainer" containerID="57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.337059 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qk7" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.361055 4672 scope.go:117] "RemoveContainer" containerID="a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.371326 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6qk7"] Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.388463 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c6qk7"] Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.401796 4672 scope.go:117] "RemoveContainer" containerID="9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.427564 4672 scope.go:117] "RemoveContainer" containerID="57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f" Dec 06 09:44:26 crc kubenswrapper[4672]: E1206 09:44:26.428055 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f\": container with ID starting with 57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f not found: ID does not exist" containerID="57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.428106 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f"} err="failed to get container status \"57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f\": rpc error: code = NotFound desc = could not find container \"57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f\": container with ID starting with 57b22a022434572d20255b9c6c23c131a52af087cf048bad3c82dafa0823cc5f not found: ID does not exist" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.428137 4672 scope.go:117] "RemoveContainer" containerID="a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c" Dec 06 09:44:26 crc kubenswrapper[4672]: E1206 09:44:26.428557 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c\": container with ID starting with a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c not found: ID does not exist" containerID="a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.428584 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c"} err="failed to get container status \"a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c\": rpc error: code = NotFound desc = could not find container \"a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c\": container with ID starting with a0f13418766b67dd499058bd1b499f96130ad24a32e1d0183acd0b2cb879db7c not found: ID does not exist" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.428618 4672 scope.go:117] "RemoveContainer" containerID="9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf" Dec 06 09:44:26 crc kubenswrapper[4672]: E1206 09:44:26.428880 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf\": container with ID starting with 9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf not found: ID does not exist" containerID="9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.428911 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf"} err="failed to get container status \"9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf\": rpc error: code = NotFound desc = could not find container \"9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf\": container with ID starting with 9aa007af845193a28eb1dfff14f70c31209e685cdd55b97c04dd567e954fc0cf not found: ID does not exist" Dec 06 09:44:26 crc kubenswrapper[4672]: I1206 09:44:26.570716 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" path="/var/lib/kubelet/pods/da19355e-8626-43f9-a9f7-2e09b176dd8b/volumes" Dec 06 09:44:28 crc kubenswrapper[4672]: I1206 09:44:28.557621 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:44:28 crc kubenswrapper[4672]: E1206 09:44:28.557932 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:44:42 crc kubenswrapper[4672]: I1206 09:44:42.562580 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:44:42 crc kubenswrapper[4672]: E1206 09:44:42.563430 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:44:55 crc kubenswrapper[4672]: I1206 09:44:55.556365 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:44:55 crc kubenswrapper[4672]: E1206 09:44:55.557213 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.176899 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv"] Dec 06 09:45:00 crc kubenswrapper[4672]: E1206 09:45:00.180910 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="registry-server" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.181179 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="registry-server" Dec 06 09:45:00 crc kubenswrapper[4672]: E1206 09:45:00.181396 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="registry-server" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.181575 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="registry-server" Dec 06 09:45:00 crc kubenswrapper[4672]: E1206 09:45:00.181846 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="extract-utilities" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.182015 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="extract-utilities" Dec 06 09:45:00 crc kubenswrapper[4672]: E1206 09:45:00.182214 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="extract-content" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.182398 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="extract-content" Dec 06 09:45:00 crc kubenswrapper[4672]: E1206 09:45:00.182565 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="extract-content" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.182820 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="extract-content" Dec 06 09:45:00 crc kubenswrapper[4672]: E1206 09:45:00.183055 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="extract-utilities" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.183226 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="extract-utilities" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.183892 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="da19355e-8626-43f9-a9f7-2e09b176dd8b" containerName="registry-server" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.184128 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6c3ad6-bdcd-49c1-a231-ffd9dda571d0" containerName="registry-server" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.185763 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.188645 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.189307 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.198979 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv"] Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.257536 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9btj\" (UniqueName: \"kubernetes.io/projected/42b72307-c01a-44b8-88ce-9a267335daff-kube-api-access-g9btj\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.257903 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42b72307-c01a-44b8-88ce-9a267335daff-config-volume\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.257927 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42b72307-c01a-44b8-88ce-9a267335daff-secret-volume\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.359062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9btj\" (UniqueName: \"kubernetes.io/projected/42b72307-c01a-44b8-88ce-9a267335daff-kube-api-access-g9btj\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.359141 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42b72307-c01a-44b8-88ce-9a267335daff-config-volume\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.359166 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42b72307-c01a-44b8-88ce-9a267335daff-secret-volume\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.361345 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42b72307-c01a-44b8-88ce-9a267335daff-config-volume\") pod 
\"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.367273 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42b72307-c01a-44b8-88ce-9a267335daff-secret-volume\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.381909 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9btj\" (UniqueName: \"kubernetes.io/projected/42b72307-c01a-44b8-88ce-9a267335daff-kube-api-access-g9btj\") pod \"collect-profiles-29416905-jm4xv\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.513147 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:00 crc kubenswrapper[4672]: I1206 09:45:00.962992 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv"] Dec 06 09:45:00 crc kubenswrapper[4672]: W1206 09:45:00.972265 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b72307_c01a_44b8_88ce_9a267335daff.slice/crio-99e203c7e30757ee169d20120e834aae1aa615bfc1590ecdf11072654935294d WatchSource:0}: Error finding container 99e203c7e30757ee169d20120e834aae1aa615bfc1590ecdf11072654935294d: Status 404 returned error can't find the container with id 99e203c7e30757ee169d20120e834aae1aa615bfc1590ecdf11072654935294d Dec 06 09:45:01 crc kubenswrapper[4672]: I1206 09:45:01.652238 4672 generic.go:334] "Generic (PLEG): container finished" podID="42b72307-c01a-44b8-88ce-9a267335daff" containerID="7d95e0c8ff81809b4bdf44e73b16dcab4d27146c291fa055d1d0e0ac2caf7529" exitCode=0 Dec 06 09:45:01 crc kubenswrapper[4672]: I1206 09:45:01.652469 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" event={"ID":"42b72307-c01a-44b8-88ce-9a267335daff","Type":"ContainerDied","Data":"7d95e0c8ff81809b4bdf44e73b16dcab4d27146c291fa055d1d0e0ac2caf7529"} Dec 06 09:45:01 crc kubenswrapper[4672]: I1206 09:45:01.652753 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" event={"ID":"42b72307-c01a-44b8-88ce-9a267335daff","Type":"ContainerStarted","Data":"99e203c7e30757ee169d20120e834aae1aa615bfc1590ecdf11072654935294d"} Dec 06 09:45:02 crc kubenswrapper[4672]: I1206 09:45:02.970170 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.016583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9btj\" (UniqueName: \"kubernetes.io/projected/42b72307-c01a-44b8-88ce-9a267335daff-kube-api-access-g9btj\") pod \"42b72307-c01a-44b8-88ce-9a267335daff\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.016996 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42b72307-c01a-44b8-88ce-9a267335daff-secret-volume\") pod \"42b72307-c01a-44b8-88ce-9a267335daff\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.017138 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42b72307-c01a-44b8-88ce-9a267335daff-config-volume\") pod \"42b72307-c01a-44b8-88ce-9a267335daff\" (UID: \"42b72307-c01a-44b8-88ce-9a267335daff\") " Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.017727 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b72307-c01a-44b8-88ce-9a267335daff-config-volume" (OuterVolumeSpecName: "config-volume") pod "42b72307-c01a-44b8-88ce-9a267335daff" (UID: "42b72307-c01a-44b8-88ce-9a267335daff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.024808 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b72307-c01a-44b8-88ce-9a267335daff-kube-api-access-g9btj" (OuterVolumeSpecName: "kube-api-access-g9btj") pod "42b72307-c01a-44b8-88ce-9a267335daff" (UID: "42b72307-c01a-44b8-88ce-9a267335daff"). InnerVolumeSpecName "kube-api-access-g9btj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.030498 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b72307-c01a-44b8-88ce-9a267335daff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42b72307-c01a-44b8-88ce-9a267335daff" (UID: "42b72307-c01a-44b8-88ce-9a267335daff"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.119386 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9btj\" (UniqueName: \"kubernetes.io/projected/42b72307-c01a-44b8-88ce-9a267335daff-kube-api-access-g9btj\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.119432 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42b72307-c01a-44b8-88ce-9a267335daff-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.119442 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42b72307-c01a-44b8-88ce-9a267335daff-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.673342 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" event={"ID":"42b72307-c01a-44b8-88ce-9a267335daff","Type":"ContainerDied","Data":"99e203c7e30757ee169d20120e834aae1aa615bfc1590ecdf11072654935294d"} Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.673420 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv" Dec 06 09:45:03 crc kubenswrapper[4672]: I1206 09:45:03.673425 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e203c7e30757ee169d20120e834aae1aa615bfc1590ecdf11072654935294d" Dec 06 09:45:04 crc kubenswrapper[4672]: I1206 09:45:04.046831 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"] Dec 06 09:45:04 crc kubenswrapper[4672]: I1206 09:45:04.055377 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-wvm9x"] Dec 06 09:45:04 crc kubenswrapper[4672]: I1206 09:45:04.573697 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a761a8-3e6d-42eb-b0f8-db388dcf6952" path="/var/lib/kubelet/pods/b8a761a8-3e6d-42eb-b0f8-db388dcf6952/volumes" Dec 06 09:45:09 crc kubenswrapper[4672]: I1206 09:45:09.557738 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:45:09 crc kubenswrapper[4672]: E1206 09:45:09.560071 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:45:17 crc kubenswrapper[4672]: I1206 09:45:17.792805 4672 generic.go:334] "Generic (PLEG): container finished" podID="87cce220-e210-44d8-ac72-946b6e9bb4c4" containerID="4e7ff482a63fef5ff3b0b48cbf361b361d2d7ad2d9b5f69d175097af8870f974" exitCode=0 Dec 06 09:45:17 crc kubenswrapper[4672]: I1206 09:45:17.792903 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" event={"ID":"87cce220-e210-44d8-ac72-946b6e9bb4c4","Type":"ContainerDied","Data":"4e7ff482a63fef5ff3b0b48cbf361b361d2d7ad2d9b5f69d175097af8870f974"} Dec 06 
09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.213000 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.344006 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-bootstrap-combined-ca-bundle\") pod \"87cce220-e210-44d8-ac72-946b6e9bb4c4\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.344105 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ssh-key\") pod \"87cce220-e210-44d8-ac72-946b6e9bb4c4\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.344160 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-inventory\") pod \"87cce220-e210-44d8-ac72-946b6e9bb4c4\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.344228 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj2kl\" (UniqueName: \"kubernetes.io/projected/87cce220-e210-44d8-ac72-946b6e9bb4c4-kube-api-access-kj2kl\") pod \"87cce220-e210-44d8-ac72-946b6e9bb4c4\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.344269 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ceph\") pod \"87cce220-e210-44d8-ac72-946b6e9bb4c4\" (UID: \"87cce220-e210-44d8-ac72-946b6e9bb4c4\") " Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.357659 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ceph" (OuterVolumeSpecName: "ceph") pod "87cce220-e210-44d8-ac72-946b6e9bb4c4" (UID: "87cce220-e210-44d8-ac72-946b6e9bb4c4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.358974 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cce220-e210-44d8-ac72-946b6e9bb4c4-kube-api-access-kj2kl" (OuterVolumeSpecName: "kube-api-access-kj2kl") pod "87cce220-e210-44d8-ac72-946b6e9bb4c4" (UID: "87cce220-e210-44d8-ac72-946b6e9bb4c4"). InnerVolumeSpecName "kube-api-access-kj2kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.365201 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "87cce220-e210-44d8-ac72-946b6e9bb4c4" (UID: "87cce220-e210-44d8-ac72-946b6e9bb4c4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.378484 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-inventory" (OuterVolumeSpecName: "inventory") pod "87cce220-e210-44d8-ac72-946b6e9bb4c4" (UID: "87cce220-e210-44d8-ac72-946b6e9bb4c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.388255 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87cce220-e210-44d8-ac72-946b6e9bb4c4" (UID: "87cce220-e210-44d8-ac72-946b6e9bb4c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.448052 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.448444 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj2kl\" (UniqueName: \"kubernetes.io/projected/87cce220-e210-44d8-ac72-946b6e9bb4c4-kube-api-access-kj2kl\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.448500 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.448520 4672 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.448535 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87cce220-e210-44d8-ac72-946b6e9bb4c4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.816713 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.816725 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9" event={"ID":"87cce220-e210-44d8-ac72-946b6e9bb4c4","Type":"ContainerDied","Data":"473294c5f99fe7f6ba38cc1bf740f7013a59f8df83d7695ba09fa18e50df1779"} Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.817354 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="473294c5f99fe7f6ba38cc1bf740f7013a59f8df83d7695ba09fa18e50df1779" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.961365 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2"] Dec 06 09:45:19 crc kubenswrapper[4672]: E1206 09:45:19.961878 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b72307-c01a-44b8-88ce-9a267335daff" containerName="collect-profiles" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.961905 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b72307-c01a-44b8-88ce-9a267335daff" containerName="collect-profiles" Dec 06 09:45:19 crc kubenswrapper[4672]: E1206 09:45:19.961944 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cce220-e210-44d8-ac72-946b6e9bb4c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.961955 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cce220-e210-44d8-ac72-946b6e9bb4c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.962174 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b72307-c01a-44b8-88ce-9a267335daff" containerName="collect-profiles" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.962204 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="87cce220-e210-44d8-ac72-946b6e9bb4c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.962923 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.965132 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.965628 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.966505 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.967313 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.969797 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:45:19 crc kubenswrapper[4672]: I1206 09:45:19.971032 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2"] Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.059278 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.059553 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g42zx\" (UniqueName: \"kubernetes.io/projected/d3188b54-be64-4ee4-a4ff-4af6f300e58a-kube-api-access-g42zx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.059724 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.059804 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.161803 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g42zx\" (UniqueName: \"kubernetes.io/projected/d3188b54-be64-4ee4-a4ff-4af6f300e58a-kube-api-access-g42zx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.161891 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.161914 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.161994 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.170000 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.170712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.174088 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.178381 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g42zx\" (UniqueName: \"kubernetes.io/projected/d3188b54-be64-4ee4-a4ff-4af6f300e58a-kube-api-access-g42zx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.279181 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.802724 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2"] Dec 06 09:45:20 crc kubenswrapper[4672]: I1206 09:45:20.846790 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" event={"ID":"d3188b54-be64-4ee4-a4ff-4af6f300e58a","Type":"ContainerStarted","Data":"d8ec913733651a5504ea47173782ae3a8839c56113d9c8bece7f37d2f91d68ba"} Dec 06 09:45:21 crc kubenswrapper[4672]: I1206 09:45:21.855550 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" event={"ID":"d3188b54-be64-4ee4-a4ff-4af6f300e58a","Type":"ContainerStarted","Data":"abf6bd3f6507883f69e9c3b5bc58f0548a2fe78e497623194fc7c1d9c90589a6"} Dec 06 09:45:21 crc kubenswrapper[4672]: I1206 09:45:21.875083 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" podStartSLOduration=2.442757805 podStartE2EDuration="2.875060423s" podCreationTimestamp="2025-12-06 09:45:19 +0000 UTC" firstStartedPulling="2025-12-06 09:45:20.816876256 +0000 UTC m=+2338.561136543" lastFinishedPulling="2025-12-06 09:45:21.249178874 +0000 UTC m=+2338.993439161" observedRunningTime="2025-12-06 09:45:21.86897545 +0000 UTC m=+2339.613235757" watchObservedRunningTime="2025-12-06 09:45:21.875060423 +0000 UTC m=+2339.619320710" Dec 06 09:45:24 crc kubenswrapper[4672]: I1206 09:45:24.556895 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:45:24 crc kubenswrapper[4672]: E1206 09:45:24.557438 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:45:27 crc kubenswrapper[4672]: I1206 09:45:27.757006 4672 scope.go:117] "RemoveContainer" containerID="713878a4e078961632df76317843927d6c71d5d3568ea257d4d5594048833128" Dec 06 09:45:38 crc kubenswrapper[4672]: I1206 09:45:38.557434 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:45:38 crc kubenswrapper[4672]: E1206 09:45:38.558287 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:45:50 crc kubenswrapper[4672]: I1206 09:45:50.095627 4672 generic.go:334] "Generic (PLEG): container finished" podID="d3188b54-be64-4ee4-a4ff-4af6f300e58a" containerID="abf6bd3f6507883f69e9c3b5bc58f0548a2fe78e497623194fc7c1d9c90589a6" exitCode=0 Dec 06 09:45:50 crc kubenswrapper[4672]: I1206 09:45:50.096219 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" event={"ID":"d3188b54-be64-4ee4-a4ff-4af6f300e58a","Type":"ContainerDied","Data":"abf6bd3f6507883f69e9c3b5bc58f0548a2fe78e497623194fc7c1d9c90589a6"} Dec 06 09:45:50 crc kubenswrapper[4672]: I1206 09:45:50.557819 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:45:50 crc kubenswrapper[4672]: E1206 09:45:50.558174 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.116954 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-kpmch" podUID="8244458a-10b4-4c4f-8f9e-dc93e90329af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.538035 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.582386 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g42zx\" (UniqueName: \"kubernetes.io/projected/d3188b54-be64-4ee4-a4ff-4af6f300e58a-kube-api-access-g42zx\") pod \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.582456 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ssh-key\") pod \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.582549 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ceph\") pod \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.582639 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-inventory\") pod \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\" (UID: \"d3188b54-be64-4ee4-a4ff-4af6f300e58a\") " Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.588143 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ceph" (OuterVolumeSpecName: "ceph") pod "d3188b54-be64-4ee4-a4ff-4af6f300e58a" (UID: "d3188b54-be64-4ee4-a4ff-4af6f300e58a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.588856 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3188b54-be64-4ee4-a4ff-4af6f300e58a-kube-api-access-g42zx" (OuterVolumeSpecName: "kube-api-access-g42zx") pod "d3188b54-be64-4ee4-a4ff-4af6f300e58a" (UID: "d3188b54-be64-4ee4-a4ff-4af6f300e58a"). InnerVolumeSpecName "kube-api-access-g42zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.616397 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-inventory" (OuterVolumeSpecName: "inventory") pod "d3188b54-be64-4ee4-a4ff-4af6f300e58a" (UID: "d3188b54-be64-4ee4-a4ff-4af6f300e58a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.618043 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3188b54-be64-4ee4-a4ff-4af6f300e58a" (UID: "d3188b54-be64-4ee4-a4ff-4af6f300e58a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.684837 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g42zx\" (UniqueName: \"kubernetes.io/projected/d3188b54-be64-4ee4-a4ff-4af6f300e58a-kube-api-access-g42zx\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.684882 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.684895 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:51 crc kubenswrapper[4672]: I1206 09:45:51.684907 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3188b54-be64-4ee4-a4ff-4af6f300e58a-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.122910 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" event={"ID":"d3188b54-be64-4ee4-a4ff-4af6f300e58a","Type":"ContainerDied","Data":"d8ec913733651a5504ea47173782ae3a8839c56113d9c8bece7f37d2f91d68ba"} Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.122951 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ec913733651a5504ea47173782ae3a8839c56113d9c8bece7f37d2f91d68ba" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.122985 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.261054 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn"] Dec 06 09:45:52 crc kubenswrapper[4672]: E1206 09:45:52.262510 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3188b54-be64-4ee4-a4ff-4af6f300e58a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.262682 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3188b54-be64-4ee4-a4ff-4af6f300e58a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.263072 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3188b54-be64-4ee4-a4ff-4af6f300e58a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.264071 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.267798 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.268222 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.268239 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.271032 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.271379 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.281422 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn"] Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.331929 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.331979 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.332094 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5nhz\" (UniqueName: \"kubernetes.io/projected/53ed8161-58e0-4b3b-91bf-190216b16b12-kube-api-access-z5nhz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" 
(UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.332123 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.434109 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.434282 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5nhz\" (UniqueName: \"kubernetes.io/projected/53ed8161-58e0-4b3b-91bf-190216b16b12-kube-api-access-z5nhz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.434312 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.434723 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.439975 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.441070 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.443085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.451008 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5nhz\" (UniqueName: \"kubernetes.io/projected/53ed8161-58e0-4b3b-91bf-190216b16b12-kube-api-access-z5nhz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:52 crc kubenswrapper[4672]: I1206 09:45:52.656486 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:45:53 crc kubenswrapper[4672]: I1206 09:45:53.190373 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn"] Dec 06 09:45:54 crc kubenswrapper[4672]: I1206 09:45:54.141720 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" event={"ID":"53ed8161-58e0-4b3b-91bf-190216b16b12","Type":"ContainerStarted","Data":"a6f86ac6d166988646645c132afe86b4c4a25e9b5754cf048e297b5b38a0eaf4"} Dec 06 09:45:57 crc kubenswrapper[4672]: I1206 09:45:57.177677 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" event={"ID":"53ed8161-58e0-4b3b-91bf-190216b16b12","Type":"ContainerStarted","Data":"155a49eb8b2bc33feef5357e61f0c3fa10717498a55d9dfdaa1b00c569d384f9"} Dec 06 09:45:57 crc kubenswrapper[4672]: I1206 09:45:57.198422 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" podStartSLOduration=2.645371601 podStartE2EDuration="5.198393741s" podCreationTimestamp="2025-12-06 09:45:52 +0000 UTC" firstStartedPulling="2025-12-06 09:45:53.200301497 +0000 UTC m=+2370.944561794" lastFinishedPulling="2025-12-06 09:45:55.753323647 +0000 UTC m=+2373.497583934" observedRunningTime="2025-12-06 09:45:57.191520497 +0000 UTC m=+2374.935780784" watchObservedRunningTime="2025-12-06 09:45:57.198393741 +0000 UTC m=+2374.942654028" Dec 06 09:46:01 crc kubenswrapper[4672]: I1206 09:46:01.557469 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:46:01 crc kubenswrapper[4672]: E1206 09:46:01.558244 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:46:02 crc kubenswrapper[4672]: I1206 09:46:02.226082 4672 generic.go:334] "Generic (PLEG): container finished" podID="53ed8161-58e0-4b3b-91bf-190216b16b12" containerID="155a49eb8b2bc33feef5357e61f0c3fa10717498a55d9dfdaa1b00c569d384f9" exitCode=0 Dec 06 09:46:02 crc kubenswrapper[4672]: I1206 09:46:02.226224 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" event={"ID":"53ed8161-58e0-4b3b-91bf-190216b16b12","Type":"ContainerDied","Data":"155a49eb8b2bc33feef5357e61f0c3fa10717498a55d9dfdaa1b00c569d384f9"} 
Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.688939 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.822178 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ceph\") pod \"53ed8161-58e0-4b3b-91bf-190216b16b12\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.822221 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-inventory\") pod \"53ed8161-58e0-4b3b-91bf-190216b16b12\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.822282 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ssh-key\") pod \"53ed8161-58e0-4b3b-91bf-190216b16b12\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.822329 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5nhz\" (UniqueName: \"kubernetes.io/projected/53ed8161-58e0-4b3b-91bf-190216b16b12-kube-api-access-z5nhz\") pod \"53ed8161-58e0-4b3b-91bf-190216b16b12\" (UID: \"53ed8161-58e0-4b3b-91bf-190216b16b12\") " Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.830160 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ed8161-58e0-4b3b-91bf-190216b16b12-kube-api-access-z5nhz" (OuterVolumeSpecName: "kube-api-access-z5nhz") pod "53ed8161-58e0-4b3b-91bf-190216b16b12" (UID: "53ed8161-58e0-4b3b-91bf-190216b16b12"). InnerVolumeSpecName "kube-api-access-z5nhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.842806 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ceph" (OuterVolumeSpecName: "ceph") pod "53ed8161-58e0-4b3b-91bf-190216b16b12" (UID: "53ed8161-58e0-4b3b-91bf-190216b16b12"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.853839 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53ed8161-58e0-4b3b-91bf-190216b16b12" (UID: "53ed8161-58e0-4b3b-91bf-190216b16b12"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.863298 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-inventory" (OuterVolumeSpecName: "inventory") pod "53ed8161-58e0-4b3b-91bf-190216b16b12" (UID: "53ed8161-58e0-4b3b-91bf-190216b16b12"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.925472 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.925513 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.925526 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed8161-58e0-4b3b-91bf-190216b16b12-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:03 crc kubenswrapper[4672]: I1206 09:46:03.925537 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5nhz\" (UniqueName: \"kubernetes.io/projected/53ed8161-58e0-4b3b-91bf-190216b16b12-kube-api-access-z5nhz\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.241170 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" event={"ID":"53ed8161-58e0-4b3b-91bf-190216b16b12","Type":"ContainerDied","Data":"a6f86ac6d166988646645c132afe86b4c4a25e9b5754cf048e297b5b38a0eaf4"} Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.241215 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6f86ac6d166988646645c132afe86b4c4a25e9b5754cf048e297b5b38a0eaf4" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.241236 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.334981 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw"] Dec 06 09:46:04 crc kubenswrapper[4672]: E1206 09:46:04.338960 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ed8161-58e0-4b3b-91bf-190216b16b12" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.339068 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ed8161-58e0-4b3b-91bf-190216b16b12" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.339337 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ed8161-58e0-4b3b-91bf-190216b16b12" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.340333 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.346247 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.346509 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.346971 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.347241 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.348903 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.365076 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw"] Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.434298 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.434339 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.434460 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpxl\" (UniqueName: \"kubernetes.io/projected/cad15908-dabc-4b48-9aa7-977801ce63ff-kube-api-access-6tpxl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.434689 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.536695 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpxl\" (UniqueName: \"kubernetes.io/projected/cad15908-dabc-4b48-9aa7-977801ce63ff-kube-api-access-6tpxl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.536765 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.536847 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.536864 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.541993 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.544089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.547512 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.556274 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpxl\" (UniqueName: \"kubernetes.io/projected/cad15908-dabc-4b48-9aa7-977801ce63ff-kube-api-access-6tpxl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gzmnw\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:04 crc kubenswrapper[4672]: I1206 09:46:04.656252 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:05 crc kubenswrapper[4672]: I1206 09:46:05.258901 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw"] Dec 06 09:46:05 crc kubenswrapper[4672]: I1206 09:46:05.268877 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:46:06 crc kubenswrapper[4672]: I1206 09:46:06.266452 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" event={"ID":"cad15908-dabc-4b48-9aa7-977801ce63ff","Type":"ContainerStarted","Data":"3ff77d5eb20fc5555a04472b702be03a9ef5e2e0171400955acb80f999fd90bc"} Dec 06 09:46:08 crc kubenswrapper[4672]: I1206 09:46:08.290347 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" event={"ID":"cad15908-dabc-4b48-9aa7-977801ce63ff","Type":"ContainerStarted","Data":"0201a53077f6a643834a3ebd01cbd81ef40b35711ab1793ce6058fa3208c9382"} Dec 06 09:46:08 crc kubenswrapper[4672]: I1206 09:46:08.318059 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" podStartSLOduration=2.167721628 podStartE2EDuration="4.318038508s" podCreationTimestamp="2025-12-06 09:46:04 +0000 UTC" firstStartedPulling="2025-12-06 09:46:05.26848507 +0000 UTC m=+2383.012745357" lastFinishedPulling="2025-12-06 09:46:07.41880195 +0000 UTC m=+2385.163062237" observedRunningTime="2025-12-06 09:46:08.315257423 +0000 UTC m=+2386.059517720" watchObservedRunningTime="2025-12-06 09:46:08.318038508 +0000 UTC m=+2386.062298795" Dec 06 09:46:15 crc kubenswrapper[4672]: I1206 09:46:15.557699 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:46:15 crc kubenswrapper[4672]: E1206 09:46:15.558693 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:46:28 crc kubenswrapper[4672]: I1206 09:46:28.565207 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:46:28 crc kubenswrapper[4672]: E1206 09:46:28.566263 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:46:39 crc kubenswrapper[4672]: I1206 09:46:39.557306 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:46:39 crc kubenswrapper[4672]: E1206 09:46:39.558012 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:46:52 crc kubenswrapper[4672]: I1206 09:46:52.641091 4672 generic.go:334] "Generic (PLEG): container finished" podID="cad15908-dabc-4b48-9aa7-977801ce63ff" containerID="0201a53077f6a643834a3ebd01cbd81ef40b35711ab1793ce6058fa3208c9382" exitCode=0 Dec 06 09:46:52 crc kubenswrapper[4672]: I1206 09:46:52.641194 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" event={"ID":"cad15908-dabc-4b48-9aa7-977801ce63ff","Type":"ContainerDied","Data":"0201a53077f6a643834a3ebd01cbd81ef40b35711ab1793ce6058fa3208c9382"} Dec 06 09:46:53 crc kubenswrapper[4672]: I1206 09:46:53.557114 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:46:53 crc kubenswrapper[4672]: E1206 09:46:53.557734 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.077441 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.202463 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ssh-key\") pod \"cad15908-dabc-4b48-9aa7-977801ce63ff\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.202870 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpxl\" (UniqueName: \"kubernetes.io/projected/cad15908-dabc-4b48-9aa7-977801ce63ff-kube-api-access-6tpxl\") pod \"cad15908-dabc-4b48-9aa7-977801ce63ff\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.202896 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-inventory\") pod \"cad15908-dabc-4b48-9aa7-977801ce63ff\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.203032 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ceph\") pod \"cad15908-dabc-4b48-9aa7-977801ce63ff\" (UID: \"cad15908-dabc-4b48-9aa7-977801ce63ff\") " Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.218984 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ceph" (OuterVolumeSpecName: "ceph") pod "cad15908-dabc-4b48-9aa7-977801ce63ff" (UID: "cad15908-dabc-4b48-9aa7-977801ce63ff"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.221930 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad15908-dabc-4b48-9aa7-977801ce63ff-kube-api-access-6tpxl" (OuterVolumeSpecName: "kube-api-access-6tpxl") pod "cad15908-dabc-4b48-9aa7-977801ce63ff" (UID: "cad15908-dabc-4b48-9aa7-977801ce63ff"). InnerVolumeSpecName "kube-api-access-6tpxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.229640 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-inventory" (OuterVolumeSpecName: "inventory") pod "cad15908-dabc-4b48-9aa7-977801ce63ff" (UID: "cad15908-dabc-4b48-9aa7-977801ce63ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.240793 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cad15908-dabc-4b48-9aa7-977801ce63ff" (UID: "cad15908-dabc-4b48-9aa7-977801ce63ff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.305894 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.306136 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.306201 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tpxl\" (UniqueName: \"kubernetes.io/projected/cad15908-dabc-4b48-9aa7-977801ce63ff-kube-api-access-6tpxl\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.306316 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad15908-dabc-4b48-9aa7-977801ce63ff-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.658744 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" event={"ID":"cad15908-dabc-4b48-9aa7-977801ce63ff","Type":"ContainerDied","Data":"3ff77d5eb20fc5555a04472b702be03a9ef5e2e0171400955acb80f999fd90bc"} Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.658798 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gzmnw" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.658804 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff77d5eb20fc5555a04472b702be03a9ef5e2e0171400955acb80f999fd90bc" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.771875 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d"] Dec 06 09:46:54 crc kubenswrapper[4672]: E1206 09:46:54.772363 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad15908-dabc-4b48-9aa7-977801ce63ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.772380 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad15908-dabc-4b48-9aa7-977801ce63ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.772587 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad15908-dabc-4b48-9aa7-977801ce63ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.773310 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.776886 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.777065 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.777126 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.778828 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.778850 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.786080 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d"] Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.816194 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.816266 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.816303 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzxs4\" 
(UniqueName: \"kubernetes.io/projected/fc76ad12-899a-427b-abd7-57b3375a29ea-kube-api-access-fzxs4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.816418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.917433 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.917498 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.917531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzxs4\" (UniqueName: \"kubernetes.io/projected/fc76ad12-899a-427b-abd7-57b3375a29ea-kube-api-access-fzxs4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.917672 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.921257 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.921732 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.928038 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ssh-key\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:54 crc kubenswrapper[4672]: I1206 09:46:54.937407 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzxs4\" (UniqueName: \"kubernetes.io/projected/fc76ad12-899a-427b-abd7-57b3375a29ea-kube-api-access-fzxs4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:55 crc kubenswrapper[4672]: I1206 09:46:55.104640 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:46:55 crc kubenswrapper[4672]: I1206 09:46:55.723239 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d"] Dec 06 09:46:55 crc kubenswrapper[4672]: W1206 09:46:55.734649 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc76ad12_899a_427b_abd7_57b3375a29ea.slice/crio-068e5a7850971764219cafff1a4b7cae2f929078d6645244167a1c2463b17162 WatchSource:0}: Error finding container 068e5a7850971764219cafff1a4b7cae2f929078d6645244167a1c2463b17162: Status 404 returned error can't find the container with id 068e5a7850971764219cafff1a4b7cae2f929078d6645244167a1c2463b17162 Dec 06 09:46:56 crc kubenswrapper[4672]: I1206 09:46:56.683350 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" event={"ID":"fc76ad12-899a-427b-abd7-57b3375a29ea","Type":"ContainerStarted","Data":"068e5a7850971764219cafff1a4b7cae2f929078d6645244167a1c2463b17162"} Dec 06 09:46:57 crc kubenswrapper[4672]: I1206 09:46:57.709806 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" event={"ID":"fc76ad12-899a-427b-abd7-57b3375a29ea","Type":"ContainerStarted","Data":"2e4a035037e0603f5c6b38f9d71e2a5f0766299ab4b3b38248bf56c8ae922a6d"} Dec 06 09:47:01 crc kubenswrapper[4672]: I1206 09:47:01.757345 4672 generic.go:334] "Generic (PLEG): container finished" podID="fc76ad12-899a-427b-abd7-57b3375a29ea" containerID="2e4a035037e0603f5c6b38f9d71e2a5f0766299ab4b3b38248bf56c8ae922a6d" exitCode=0 Dec 06 09:47:01 crc kubenswrapper[4672]: I1206 09:47:01.757437 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" event={"ID":"fc76ad12-899a-427b-abd7-57b3375a29ea","Type":"ContainerDied","Data":"2e4a035037e0603f5c6b38f9d71e2a5f0766299ab4b3b38248bf56c8ae922a6d"} Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.245376 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.295638 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ssh-key\") pod \"fc76ad12-899a-427b-abd7-57b3375a29ea\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.295831 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzxs4\" (UniqueName: \"kubernetes.io/projected/fc76ad12-899a-427b-abd7-57b3375a29ea-kube-api-access-fzxs4\") pod \"fc76ad12-899a-427b-abd7-57b3375a29ea\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.295933 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-inventory\") pod \"fc76ad12-899a-427b-abd7-57b3375a29ea\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.299747 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ceph\") pod \"fc76ad12-899a-427b-abd7-57b3375a29ea\" (UID: \"fc76ad12-899a-427b-abd7-57b3375a29ea\") " Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.300408 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc76ad12-899a-427b-abd7-57b3375a29ea-kube-api-access-fzxs4" (OuterVolumeSpecName: "kube-api-access-fzxs4") pod "fc76ad12-899a-427b-abd7-57b3375a29ea" (UID: "fc76ad12-899a-427b-abd7-57b3375a29ea"). InnerVolumeSpecName "kube-api-access-fzxs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.300910 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzxs4\" (UniqueName: \"kubernetes.io/projected/fc76ad12-899a-427b-abd7-57b3375a29ea-kube-api-access-fzxs4\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.303333 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ceph" (OuterVolumeSpecName: "ceph") pod "fc76ad12-899a-427b-abd7-57b3375a29ea" (UID: "fc76ad12-899a-427b-abd7-57b3375a29ea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.319004 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-inventory" (OuterVolumeSpecName: "inventory") pod "fc76ad12-899a-427b-abd7-57b3375a29ea" (UID: "fc76ad12-899a-427b-abd7-57b3375a29ea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.332106 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc76ad12-899a-427b-abd7-57b3375a29ea" (UID: "fc76ad12-899a-427b-abd7-57b3375a29ea"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.402829 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.402876 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.402884 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc76ad12-899a-427b-abd7-57b3375a29ea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.776445 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" event={"ID":"fc76ad12-899a-427b-abd7-57b3375a29ea","Type":"ContainerDied","Data":"068e5a7850971764219cafff1a4b7cae2f929078d6645244167a1c2463b17162"} Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.776492 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="068e5a7850971764219cafff1a4b7cae2f929078d6645244167a1c2463b17162" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.776551 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.926991 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc"] Dec 06 09:47:03 crc kubenswrapper[4672]: E1206 09:47:03.927586 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc76ad12-899a-427b-abd7-57b3375a29ea" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.927619 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc76ad12-899a-427b-abd7-57b3375a29ea" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.927823 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc76ad12-899a-427b-abd7-57b3375a29ea" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.928482 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.942584 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.945258 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.945649 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.945922 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.946155 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:47:03 crc kubenswrapper[4672]: I1206 09:47:03.955536 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc"] Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.015078 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.015165 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.015218 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.015243 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmwbv\" (UniqueName: \"kubernetes.io/projected/02d2290f-9fc0-4247-8db8-660f26601528-kube-api-access-gmwbv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.116545 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.116637 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.116665 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmwbv\" (UniqueName: \"kubernetes.io/projected/02d2290f-9fc0-4247-8db8-660f26601528-kube-api-access-gmwbv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.116721 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.120220 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.121248 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.135263 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.170807 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmwbv\" (UniqueName: \"kubernetes.io/projected/02d2290f-9fc0-4247-8db8-660f26601528-kube-api-access-gmwbv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.246397 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:04 crc kubenswrapper[4672]: I1206 09:47:04.890535 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc"] Dec 06 09:47:05 crc kubenswrapper[4672]: I1206 09:47:05.800519 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" event={"ID":"02d2290f-9fc0-4247-8db8-660f26601528","Type":"ContainerStarted","Data":"836c9c0c27d4c35133f5b22b5e2ea84e424c4a3e86558cf1cae4c02c00066a48"} Dec 06 09:47:05 crc kubenswrapper[4672]: I1206 09:47:05.801073 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" event={"ID":"02d2290f-9fc0-4247-8db8-660f26601528","Type":"ContainerStarted","Data":"24515ef73b184dfe2ee126d57f1997ccf7fd789560c4b2e9100f4759f4162642"} Dec 06 09:47:05 crc kubenswrapper[4672]: I1206 09:47:05.821127 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" podStartSLOduration=2.207696714 podStartE2EDuration="2.821112008s" podCreationTimestamp="2025-12-06 09:47:03 +0000 UTC" firstStartedPulling="2025-12-06 09:47:04.905784597 +0000 UTC m=+2442.650044894" lastFinishedPulling="2025-12-06 09:47:05.519199901 +0000 UTC m=+2443.263460188" observedRunningTime="2025-12-06 09:47:05.820132662 +0000 UTC m=+2443.564392959" watchObservedRunningTime="2025-12-06 09:47:05.821112008 +0000 UTC m=+2443.565372305" Dec 06 09:47:08 crc kubenswrapper[4672]: I1206 09:47:08.557222 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:47:08 crc kubenswrapper[4672]: E1206 09:47:08.558323 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:47:21 crc kubenswrapper[4672]: I1206 09:47:21.557522 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:47:21 crc kubenswrapper[4672]: E1206 09:47:21.559700 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:47:32 crc kubenswrapper[4672]: I1206 09:47:32.562843 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:47:32 crc kubenswrapper[4672]: E1206 09:47:32.564778 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:47:47 crc kubenswrapper[4672]: I1206 09:47:47.557755 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:47:47 crc kubenswrapper[4672]: E1206 09:47:47.558879 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:47:55 crc kubenswrapper[4672]: I1206 09:47:55.202065 4672 generic.go:334] "Generic (PLEG): container finished" podID="02d2290f-9fc0-4247-8db8-660f26601528" containerID="836c9c0c27d4c35133f5b22b5e2ea84e424c4a3e86558cf1cae4c02c00066a48" exitCode=0 Dec 06 09:47:55 crc kubenswrapper[4672]: I1206 09:47:55.202174 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" event={"ID":"02d2290f-9fc0-4247-8db8-660f26601528","Type":"ContainerDied","Data":"836c9c0c27d4c35133f5b22b5e2ea84e424c4a3e86558cf1cae4c02c00066a48"} Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.627369 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.696450 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmwbv\" (UniqueName: \"kubernetes.io/projected/02d2290f-9fc0-4247-8db8-660f26601528-kube-api-access-gmwbv\") pod \"02d2290f-9fc0-4247-8db8-660f26601528\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.696566 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ssh-key\") pod \"02d2290f-9fc0-4247-8db8-660f26601528\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.696671 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-inventory\") pod \"02d2290f-9fc0-4247-8db8-660f26601528\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.696698 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ceph\") pod \"02d2290f-9fc0-4247-8db8-660f26601528\" (UID: \"02d2290f-9fc0-4247-8db8-660f26601528\") " Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.702270 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ceph" (OuterVolumeSpecName: "ceph") pod "02d2290f-9fc0-4247-8db8-660f26601528" (UID: "02d2290f-9fc0-4247-8db8-660f26601528"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.703013 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d2290f-9fc0-4247-8db8-660f26601528-kube-api-access-gmwbv" (OuterVolumeSpecName: "kube-api-access-gmwbv") pod "02d2290f-9fc0-4247-8db8-660f26601528" (UID: "02d2290f-9fc0-4247-8db8-660f26601528"). InnerVolumeSpecName "kube-api-access-gmwbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.721939 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "02d2290f-9fc0-4247-8db8-660f26601528" (UID: "02d2290f-9fc0-4247-8db8-660f26601528"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.729314 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-inventory" (OuterVolumeSpecName: "inventory") pod "02d2290f-9fc0-4247-8db8-660f26601528" (UID: "02d2290f-9fc0-4247-8db8-660f26601528"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.798880 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmwbv\" (UniqueName: \"kubernetes.io/projected/02d2290f-9fc0-4247-8db8-660f26601528-kube-api-access-gmwbv\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.799059 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.799174 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:56 crc kubenswrapper[4672]: I1206 09:47:56.799292 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/02d2290f-9fc0-4247-8db8-660f26601528-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.222159 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.223020 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc" event={"ID":"02d2290f-9fc0-4247-8db8-660f26601528","Type":"ContainerDied","Data":"24515ef73b184dfe2ee126d57f1997ccf7fd789560c4b2e9100f4759f4162642"} Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.223094 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24515ef73b184dfe2ee126d57f1997ccf7fd789560c4b2e9100f4759f4162642" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.323010 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pmvr2"] Dec 06 09:47:57 crc kubenswrapper[4672]: E1206 09:47:57.323538 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d2290f-9fc0-4247-8db8-660f26601528" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.323563 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d2290f-9fc0-4247-8db8-660f26601528" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.323868 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d2290f-9fc0-4247-8db8-660f26601528" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.324660 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.328529 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.328547 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.328658 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.328890 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.331150 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.337321 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pmvr2"] Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.409515 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ceph\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.409667 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: 
\"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.409943 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm98j\" (UniqueName: \"kubernetes.io/projected/e6867be1-002c-4eae-b841-885e3e5e5d20-kube-api-access-lm98j\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.410113 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.512034 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.512673 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ceph\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.512733 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.512795 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm98j\" (UniqueName: \"kubernetes.io/projected/e6867be1-002c-4eae-b841-885e3e5e5d20-kube-api-access-lm98j\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.517042 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.517081 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ceph\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.517108 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.531635 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm98j\" (UniqueName: \"kubernetes.io/projected/e6867be1-002c-4eae-b841-885e3e5e5d20-kube-api-access-lm98j\") pod \"ssh-known-hosts-edpm-deployment-pmvr2\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:57 crc kubenswrapper[4672]: I1206 09:47:57.693325 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:47:58 crc kubenswrapper[4672]: I1206 09:47:58.298880 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pmvr2"] Dec 06 09:47:58 crc kubenswrapper[4672]: W1206 09:47:58.302743 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6867be1_002c_4eae_b841_885e3e5e5d20.slice/crio-05c50835841f45bfe28edc4455500d957f5862213e10222decbed8a8cc559d88 WatchSource:0}: Error finding container 05c50835841f45bfe28edc4455500d957f5862213e10222decbed8a8cc559d88: Status 404 returned error can't find the container with id 05c50835841f45bfe28edc4455500d957f5862213e10222decbed8a8cc559d88 Dec 06 09:47:59 crc kubenswrapper[4672]: I1206 09:47:59.249437 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" event={"ID":"e6867be1-002c-4eae-b841-885e3e5e5d20","Type":"ContainerStarted","Data":"05c50835841f45bfe28edc4455500d957f5862213e10222decbed8a8cc559d88"} Dec 06 09:48:00 crc kubenswrapper[4672]: I1206 09:48:00.261723 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" event={"ID":"e6867be1-002c-4eae-b841-885e3e5e5d20","Type":"ContainerStarted","Data":"0b8fb36fa01ab23d21fab21911600c6a3dc40a5b535abb18c110266789c58df6"} Dec 06 09:48:02 crc kubenswrapper[4672]: I1206 09:48:02.565186 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:48:02 crc kubenswrapper[4672]: E1206 09:48:02.565890 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:48:13 crc kubenswrapper[4672]: I1206 09:48:13.377489 4672 generic.go:334] "Generic (PLEG): container finished" podID="e6867be1-002c-4eae-b841-885e3e5e5d20" containerID="0b8fb36fa01ab23d21fab21911600c6a3dc40a5b535abb18c110266789c58df6" exitCode=0 Dec 06 09:48:13 crc kubenswrapper[4672]: I1206 09:48:13.377560 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" event={"ID":"e6867be1-002c-4eae-b841-885e3e5e5d20","Type":"ContainerDied","Data":"0b8fb36fa01ab23d21fab21911600c6a3dc40a5b535abb18c110266789c58df6"} Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.815379 4672 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.868981 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-inventory-0\") pod \"e6867be1-002c-4eae-b841-885e3e5e5d20\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.869056 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ceph\") pod \"e6867be1-002c-4eae-b841-885e3e5e5d20\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.869096 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ssh-key-openstack-edpm-ipam\") pod \"e6867be1-002c-4eae-b841-885e3e5e5d20\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.869134 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm98j\" (UniqueName: \"kubernetes.io/projected/e6867be1-002c-4eae-b841-885e3e5e5d20-kube-api-access-lm98j\") pod \"e6867be1-002c-4eae-b841-885e3e5e5d20\" (UID: \"e6867be1-002c-4eae-b841-885e3e5e5d20\") " Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.898837 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ceph" (OuterVolumeSpecName: "ceph") pod "e6867be1-002c-4eae-b841-885e3e5e5d20" (UID: "e6867be1-002c-4eae-b841-885e3e5e5d20"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.898864 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6867be1-002c-4eae-b841-885e3e5e5d20-kube-api-access-lm98j" (OuterVolumeSpecName: "kube-api-access-lm98j") pod "e6867be1-002c-4eae-b841-885e3e5e5d20" (UID: "e6867be1-002c-4eae-b841-885e3e5e5d20"). InnerVolumeSpecName "kube-api-access-lm98j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.902733 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6867be1-002c-4eae-b841-885e3e5e5d20" (UID: "e6867be1-002c-4eae-b841-885e3e5e5d20"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.906882 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e6867be1-002c-4eae-b841-885e3e5e5d20" (UID: "e6867be1-002c-4eae-b841-885e3e5e5d20"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.970812 4672 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.971340 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.971352 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6867be1-002c-4eae-b841-885e3e5e5d20-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:14 crc kubenswrapper[4672]: I1206 09:48:14.971363 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm98j\" (UniqueName: \"kubernetes.io/projected/e6867be1-002c-4eae-b841-885e3e5e5d20-kube-api-access-lm98j\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.401831 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" event={"ID":"e6867be1-002c-4eae-b841-885e3e5e5d20","Type":"ContainerDied","Data":"05c50835841f45bfe28edc4455500d957f5862213e10222decbed8a8cc559d88"} Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.401879 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05c50835841f45bfe28edc4455500d957f5862213e10222decbed8a8cc559d88" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.401926 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pmvr2" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.490805 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf"] Dec 06 09:48:15 crc kubenswrapper[4672]: E1206 09:48:15.491243 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6867be1-002c-4eae-b841-885e3e5e5d20" containerName="ssh-known-hosts-edpm-deployment" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.491264 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6867be1-002c-4eae-b841-885e3e5e5d20" containerName="ssh-known-hosts-edpm-deployment" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.491477 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6867be1-002c-4eae-b841-885e3e5e5d20" containerName="ssh-known-hosts-edpm-deployment" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.492151 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.494523 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.494816 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.495059 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.495061 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.495982 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.508325 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf"] Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.557133 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:48:15 crc kubenswrapper[4672]: E1206 09:48:15.557353 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.582703 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.582781 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.582827 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29l8\" (UniqueName: \"kubernetes.io/projected/5c6bfe13-aab7-4455-9879-a1e1e7276407-kube-api-access-l29l8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.582854 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.685272 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.685324 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.685357 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.685376 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29l8\" (UniqueName: \"kubernetes.io/projected/5c6bfe13-aab7-4455-9879-a1e1e7276407-kube-api-access-l29l8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.692354 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.696474 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.696478 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.704824 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29l8\" (UniqueName: \"kubernetes.io/projected/5c6bfe13-aab7-4455-9879-a1e1e7276407-kube-api-access-l29l8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9f6tf\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:15 crc kubenswrapper[4672]: I1206 09:48:15.824230 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:16 crc kubenswrapper[4672]: I1206 09:48:16.389511 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf"] Dec 06 09:48:16 crc kubenswrapper[4672]: W1206 09:48:16.405743 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6bfe13_aab7_4455_9879_a1e1e7276407.slice/crio-a2004734be5f8e02c388d8bf9f3e5f1dee4fae3bce7e80c9b9ef8392a585a7d2 WatchSource:0}: Error finding container a2004734be5f8e02c388d8bf9f3e5f1dee4fae3bce7e80c9b9ef8392a585a7d2: Status 404 returned error can't find the container with id a2004734be5f8e02c388d8bf9f3e5f1dee4fae3bce7e80c9b9ef8392a585a7d2 Dec 06 09:48:17 crc kubenswrapper[4672]: I1206 09:48:17.422429 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" event={"ID":"5c6bfe13-aab7-4455-9879-a1e1e7276407","Type":"ContainerStarted","Data":"b22f0bd138c79363b534bd8ec91d5765e2b9405de8cf773430851573080d79c4"} Dec 06 09:48:17 crc kubenswrapper[4672]: I1206 09:48:17.423054 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" event={"ID":"5c6bfe13-aab7-4455-9879-a1e1e7276407","Type":"ContainerStarted","Data":"a2004734be5f8e02c388d8bf9f3e5f1dee4fae3bce7e80c9b9ef8392a585a7d2"} Dec 06 09:48:17 crc kubenswrapper[4672]: I1206 09:48:17.445567 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" podStartSLOduration=2.018120193 podStartE2EDuration="2.445551466s" podCreationTimestamp="2025-12-06 09:48:15 +0000 UTC" firstStartedPulling="2025-12-06 09:48:16.407871158 +0000 UTC m=+2514.152131445" lastFinishedPulling="2025-12-06 09:48:16.835302421 +0000 UTC m=+2514.579562718" observedRunningTime="2025-12-06 09:48:17.440931501 +0000 UTC m=+2515.185191788" watchObservedRunningTime="2025-12-06 09:48:17.445551466 +0000 UTC m=+2515.189811753" Dec 06 09:48:25 crc kubenswrapper[4672]: E1206 09:48:25.179006 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6bfe13_aab7_4455_9879_a1e1e7276407.slice/crio-conmon-b22f0bd138c79363b534bd8ec91d5765e2b9405de8cf773430851573080d79c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6bfe13_aab7_4455_9879_a1e1e7276407.slice/crio-b22f0bd138c79363b534bd8ec91d5765e2b9405de8cf773430851573080d79c4.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:48:25 crc kubenswrapper[4672]: I1206 09:48:25.482158 4672 generic.go:334] "Generic (PLEG): container finished" podID="5c6bfe13-aab7-4455-9879-a1e1e7276407" containerID="b22f0bd138c79363b534bd8ec91d5765e2b9405de8cf773430851573080d79c4" exitCode=0 Dec 06 09:48:25 crc kubenswrapper[4672]: I1206 09:48:25.482202 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" event={"ID":"5c6bfe13-aab7-4455-9879-a1e1e7276407","Type":"ContainerDied","Data":"b22f0bd138c79363b534bd8ec91d5765e2b9405de8cf773430851573080d79c4"} Dec 06 09:48:26 crc kubenswrapper[4672]: I1206 09:48:26.557706 4672 scope.go:117] "RemoveContainer" 
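The "Observed pod startup duration" record above carries its own arithmetic: podStartSLOduration is the end-to-end startup time minus the image-pull window, and the monotonic-clock offsets (m=+...) it prints reproduce the logged value exactly (2.445551466 - (2514.579562718 - 2514.152131445) = 2.018120193). A small Go sketch of that relationship using the numbers from the record (an illustration of the bookkeeping, not kubelet's implementation):

    package main

    import "fmt"

    // Reproduces the arithmetic in the "Observed pod startup duration"
    // record above, using the monotonic offsets (m=+..., in seconds)
    // that the record prints. podStartSLOduration excludes image-pull
    // time from the end-to-end duration.
    func main() {
    	const (
    		firstStartedPulling = 2514.152131445
    		lastFinishedPulling = 2514.579562718
    		// watchObservedRunningTime (09:48:17.445551466) minus
    		// podCreationTimestamp (09:48:15), as logged.
    		podStartE2E = 2.445551466
    	)
    	pull := lastFinishedPulling - firstStartedPulling
    	slo := podStartE2E - pull
    	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, slo) // slo ≈ 2.018120193
    }
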
containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:48:26 crc kubenswrapper[4672]: E1206 09:48:26.558107 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:48:26 crc kubenswrapper[4672]: I1206 09:48:26.887569 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:26 crc kubenswrapper[4672]: I1206 09:48:26.998576 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ssh-key\") pod \"5c6bfe13-aab7-4455-9879-a1e1e7276407\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " Dec 06 09:48:26 crc kubenswrapper[4672]: I1206 09:48:26.998756 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-inventory\") pod \"5c6bfe13-aab7-4455-9879-a1e1e7276407\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " Dec 06 09:48:26 crc kubenswrapper[4672]: I1206 09:48:26.998789 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l29l8\" (UniqueName: \"kubernetes.io/projected/5c6bfe13-aab7-4455-9879-a1e1e7276407-kube-api-access-l29l8\") pod \"5c6bfe13-aab7-4455-9879-a1e1e7276407\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " Dec 06 09:48:26 crc kubenswrapper[4672]: I1206 09:48:26.998868 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ceph\") pod \"5c6bfe13-aab7-4455-9879-a1e1e7276407\" (UID: \"5c6bfe13-aab7-4455-9879-a1e1e7276407\") " Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.003997 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6bfe13-aab7-4455-9879-a1e1e7276407-kube-api-access-l29l8" (OuterVolumeSpecName: "kube-api-access-l29l8") pod "5c6bfe13-aab7-4455-9879-a1e1e7276407" (UID: "5c6bfe13-aab7-4455-9879-a1e1e7276407"). InnerVolumeSpecName "kube-api-access-l29l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.021394 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ceph" (OuterVolumeSpecName: "ceph") pod "5c6bfe13-aab7-4455-9879-a1e1e7276407" (UID: "5c6bfe13-aab7-4455-9879-a1e1e7276407"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.026969 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-inventory" (OuterVolumeSpecName: "inventory") pod "5c6bfe13-aab7-4455-9879-a1e1e7276407" (UID: "5c6bfe13-aab7-4455-9879-a1e1e7276407"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.029324 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c6bfe13-aab7-4455-9879-a1e1e7276407" (UID: "5c6bfe13-aab7-4455-9879-a1e1e7276407"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.101835 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.101907 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l29l8\" (UniqueName: \"kubernetes.io/projected/5c6bfe13-aab7-4455-9879-a1e1e7276407-kube-api-access-l29l8\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.101942 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.101956 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c6bfe13-aab7-4455-9879-a1e1e7276407-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.499291 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" event={"ID":"5c6bfe13-aab7-4455-9879-a1e1e7276407","Type":"ContainerDied","Data":"a2004734be5f8e02c388d8bf9f3e5f1dee4fae3bce7e80c9b9ef8392a585a7d2"} Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.499513 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2004734be5f8e02c388d8bf9f3e5f1dee4fae3bce7e80c9b9ef8392a585a7d2" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.499359 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9f6tf" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.588499 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt"] Dec 06 09:48:27 crc kubenswrapper[4672]: E1206 09:48:27.588872 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6bfe13-aab7-4455-9879-a1e1e7276407" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.588886 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6bfe13-aab7-4455-9879-a1e1e7276407" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.589075 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6bfe13-aab7-4455-9879-a1e1e7276407" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.589623 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.593005 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.593241 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.593427 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.593884 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.601010 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.614189 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt"] Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.713151 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.713431 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.713516 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.713638 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w5hc\" (UniqueName: \"kubernetes.io/projected/bb01149f-0837-46f0-8636-b72f5fb85e9a-kube-api-access-6w5hc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.815173 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.815228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.815274 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.815360 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w5hc\" (UniqueName: \"kubernetes.io/projected/bb01149f-0837-46f0-8636-b72f5fb85e9a-kube-api-access-6w5hc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.819060 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.821628 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.828139 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.834838 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w5hc\" (UniqueName: \"kubernetes.io/projected/bb01149f-0837-46f0-8636-b72f5fb85e9a-kube-api-access-6w5hc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:27 crc kubenswrapper[4672]: I1206 09:48:27.907306 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:28 crc kubenswrapper[4672]: I1206 09:48:28.421392 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt"] Dec 06 09:48:28 crc kubenswrapper[4672]: I1206 09:48:28.511946 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" event={"ID":"bb01149f-0837-46f0-8636-b72f5fb85e9a","Type":"ContainerStarted","Data":"923efd5467f6316e71ce6c9f36f93e90c005a391b222bd11c4880a9795573473"} Dec 06 09:48:29 crc kubenswrapper[4672]: I1206 09:48:29.520699 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" event={"ID":"bb01149f-0837-46f0-8636-b72f5fb85e9a","Type":"ContainerStarted","Data":"9800ad8b533ab08b9414f7ab366a79f29776385052a1271197d72cddf8331ed8"} Dec 06 09:48:29 crc kubenswrapper[4672]: I1206 09:48:29.544709 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" podStartSLOduration=2.150288801 podStartE2EDuration="2.544688889s" podCreationTimestamp="2025-12-06 09:48:27 +0000 UTC" firstStartedPulling="2025-12-06 09:48:28.434963496 +0000 UTC m=+2526.179223783" lastFinishedPulling="2025-12-06 09:48:28.829363584 +0000 UTC m=+2526.573623871" observedRunningTime="2025-12-06 09:48:29.541616796 +0000 UTC m=+2527.285877083" watchObservedRunningTime="2025-12-06 09:48:29.544688889 +0000 UTC m=+2527.288949176" Dec 06 09:48:37 crc kubenswrapper[4672]: I1206 09:48:37.557674 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:48:37 crc kubenswrapper[4672]: E1206 09:48:37.558466 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:48:39 crc kubenswrapper[4672]: I1206 09:48:39.627326 4672 generic.go:334] "Generic (PLEG): container finished" podID="bb01149f-0837-46f0-8636-b72f5fb85e9a" containerID="9800ad8b533ab08b9414f7ab366a79f29776385052a1271197d72cddf8331ed8" exitCode=0 Dec 06 09:48:39 crc kubenswrapper[4672]: I1206 09:48:39.627429 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" event={"ID":"bb01149f-0837-46f0-8636-b72f5fb85e9a","Type":"ContainerDied","Data":"9800ad8b533ab08b9414f7ab366a79f29776385052a1271197d72cddf8331ed8"} Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.044534 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.199363 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ssh-key\") pod \"bb01149f-0837-46f0-8636-b72f5fb85e9a\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.199550 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w5hc\" (UniqueName: \"kubernetes.io/projected/bb01149f-0837-46f0-8636-b72f5fb85e9a-kube-api-access-6w5hc\") pod \"bb01149f-0837-46f0-8636-b72f5fb85e9a\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.199644 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-inventory\") pod \"bb01149f-0837-46f0-8636-b72f5fb85e9a\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.199751 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ceph\") pod \"bb01149f-0837-46f0-8636-b72f5fb85e9a\" (UID: \"bb01149f-0837-46f0-8636-b72f5fb85e9a\") " Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.205437 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb01149f-0837-46f0-8636-b72f5fb85e9a-kube-api-access-6w5hc" (OuterVolumeSpecName: "kube-api-access-6w5hc") pod "bb01149f-0837-46f0-8636-b72f5fb85e9a" (UID: "bb01149f-0837-46f0-8636-b72f5fb85e9a"). InnerVolumeSpecName "kube-api-access-6w5hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.206286 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ceph" (OuterVolumeSpecName: "ceph") pod "bb01149f-0837-46f0-8636-b72f5fb85e9a" (UID: "bb01149f-0837-46f0-8636-b72f5fb85e9a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.231730 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-inventory" (OuterVolumeSpecName: "inventory") pod "bb01149f-0837-46f0-8636-b72f5fb85e9a" (UID: "bb01149f-0837-46f0-8636-b72f5fb85e9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.249494 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb01149f-0837-46f0-8636-b72f5fb85e9a" (UID: "bb01149f-0837-46f0-8636-b72f5fb85e9a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.302105 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.302376 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.302488 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb01149f-0837-46f0-8636-b72f5fb85e9a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.302568 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w5hc\" (UniqueName: \"kubernetes.io/projected/bb01149f-0837-46f0-8636-b72f5fb85e9a-kube-api-access-6w5hc\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.644491 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" event={"ID":"bb01149f-0837-46f0-8636-b72f5fb85e9a","Type":"ContainerDied","Data":"923efd5467f6316e71ce6c9f36f93e90c005a391b222bd11c4880a9795573473"} Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.644532 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="923efd5467f6316e71ce6c9f36f93e90c005a391b222bd11c4880a9795573473" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.644579 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.743588 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj"] Dec 06 09:48:41 crc kubenswrapper[4672]: E1206 09:48:41.744044 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb01149f-0837-46f0-8636-b72f5fb85e9a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.744062 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb01149f-0837-46f0-8636-b72f5fb85e9a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.744270 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb01149f-0837-46f0-8636-b72f5fb85e9a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.745020 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.748499 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.751871 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.752035 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.752211 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.752555 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.752711 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.757342 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.757622 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.759802 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj"] Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811145 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811199 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811239 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811261 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbbnq\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-kube-api-access-kbbnq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: 
\"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811286 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811303 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811319 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811339 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811358 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811376 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811429 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 
09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811475 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.811520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913045 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913115 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913256 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913294 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913336 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913379 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913398 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913428 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913445 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbnq\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-kube-api-access-kbbnq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913463 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.913480 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.918514 4672 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.919702 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.920217 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.921323 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.921898 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.922238 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.922467 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.923001 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.923277 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.923830 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.925467 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.926578 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:41 crc kubenswrapper[4672]: I1206 09:48:41.938726 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbbnq\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-kube-api-access-kbbnq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:48:42 crc kubenswrapper[4672]: I1206 09:48:42.061985 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 09:48:42 crc kubenswrapper[4672]: I1206 09:48:42.379539 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj"]
Dec 06 09:48:42 crc kubenswrapper[4672]: I1206 09:48:42.652366 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" event={"ID":"131ab019-8934-4783-ab57-b3ecccd11b05","Type":"ContainerStarted","Data":"364d643fe8022c199cdb5355a27a440d7e50d0710ad48d5777452d6aa7d8f08e"}
Dec 06 09:48:43 crc kubenswrapper[4672]: I1206 09:48:43.663563 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" event={"ID":"131ab019-8934-4783-ab57-b3ecccd11b05","Type":"ContainerStarted","Data":"3937c69a011652b0cf8ef998a739e641f9998bb2a72600af80da54220c3ea0c6"}
Dec 06 09:48:52 crc kubenswrapper[4672]: I1206 09:48:52.564439 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb"
Dec 06 09:48:52 crc kubenswrapper[4672]: E1206 09:48:52.565058 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:49:05 crc kubenswrapper[4672]: I1206 09:49:05.556795 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb"
Dec 06 09:49:05 crc kubenswrapper[4672]: E1206 09:49:05.558876 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 09:49:14 crc kubenswrapper[4672]: I1206 09:49:14.913851 4672 generic.go:334] "Generic (PLEG): container finished" podID="131ab019-8934-4783-ab57-b3ecccd11b05" containerID="3937c69a011652b0cf8ef998a739e641f9998bb2a72600af80da54220c3ea0c6" exitCode=0
Dec 06 09:49:14 crc kubenswrapper[4672]: I1206 09:49:14.914001 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" event={"ID":"131ab019-8934-4783-ab57-b3ecccd11b05","Type":"ContainerDied","Data":"3937c69a011652b0cf8ef998a739e641f9998bb2a72600af80da54220c3ea0c6"}
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.337494 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj"
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.521625 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-repo-setup-combined-ca-bundle\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.521993 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ovn-combined-ca-bundle\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522033 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ssh-key\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522055 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522164 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-inventory\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522322 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-neutron-metadata-combined-ca-bundle\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522425 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ceph\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522485 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-nova-combined-ca-bundle\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522680 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-libvirt-combined-ca-bundle\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522724 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522792 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522864 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbbnq\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-kube-api-access-kbbnq\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.522887 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"131ab019-8934-4783-ab57-b3ecccd11b05\" (UID: \"131ab019-8934-4783-ab57-b3ecccd11b05\") "
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.528134 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ceph" (OuterVolumeSpecName: "ceph") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.530379 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.530386 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.530451 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.530417 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.530490 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.531277 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.533759 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.534748 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.535890 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-kube-api-access-kbbnq" (OuterVolumeSpecName: "kube-api-access-kbbnq") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "kube-api-access-kbbnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.546936 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.555079 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.559388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-inventory" (OuterVolumeSpecName: "inventory") pod "131ab019-8934-4783-ab57-b3ecccd11b05" (UID: "131ab019-8934-4783-ab57-b3ecccd11b05"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625059 4672 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625111 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625123 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625133 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625144 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-inventory\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625152 4672 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625161 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-ceph\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625170 4672 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625180 4672 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625191 4672 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ab019-8934-4783-ab57-b3ecccd11b05-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625199 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625208 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbbnq\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-kube-api-access-kbbnq\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.625217 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ab019-8934-4783-ab57-b3ecccd11b05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.935913 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" event={"ID":"131ab019-8934-4783-ab57-b3ecccd11b05","Type":"ContainerDied","Data":"364d643fe8022c199cdb5355a27a440d7e50d0710ad48d5777452d6aa7d8f08e"} Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.936100 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="364d643fe8022c199cdb5355a27a440d7e50d0710ad48d5777452d6aa7d8f08e" Dec 06 09:49:16 crc kubenswrapper[4672]: I1206 09:49:16.935994 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.131061 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v"] Dec 06 09:49:17 crc kubenswrapper[4672]: E1206 09:49:17.131659 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131ab019-8934-4783-ab57-b3ecccd11b05" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.131689 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="131ab019-8934-4783-ab57-b3ecccd11b05" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.132005 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="131ab019-8934-4783-ab57-b3ecccd11b05" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.132752 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.137248 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.137350 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.137537 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.137636 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.137924 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.144370 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v"] Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.238005 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.238115 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9g7\" (UniqueName: \"kubernetes.io/projected/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-kube-api-access-sk9g7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.238371 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.238735 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.341393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.341861 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.341907 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9g7\" (UniqueName: \"kubernetes.io/projected/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-kube-api-access-sk9g7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.341978 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.345962 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.346071 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.346459 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.372680 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9g7\" (UniqueName: \"kubernetes.io/projected/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-kube-api-access-sk9g7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:17 crc kubenswrapper[4672]: I1206 09:49:17.458545 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:18 crc kubenswrapper[4672]: W1206 09:49:18.024689 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bb0cdb_025d_4251_b0f0_06185ea6fe8f.slice/crio-e0c9f1e583978b6dfc7a40bb97b4b74e43699267a4d77bb15e4d9658efd2ad9f WatchSource:0}: Error finding container e0c9f1e583978b6dfc7a40bb97b4b74e43699267a4d77bb15e4d9658efd2ad9f: Status 404 returned error can't find the container with id e0c9f1e583978b6dfc7a40bb97b4b74e43699267a4d77bb15e4d9658efd2ad9f Dec 06 09:49:18 crc kubenswrapper[4672]: I1206 09:49:18.030870 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v"] Dec 06 09:49:18 crc kubenswrapper[4672]: I1206 09:49:18.557385 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:49:18 crc kubenswrapper[4672]: I1206 09:49:18.957217 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"7cd1be7755bf34af00f4c17fd0804f052c17aac5446317d3315921d4c9466ed8"} Dec 06 09:49:18 crc kubenswrapper[4672]: I1206 09:49:18.958931 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" event={"ID":"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f","Type":"ContainerStarted","Data":"a6e35f0d129991050fa6949257f9da086beb4ab6b4ad0b90a4377bb9abead359"} Dec 06 09:49:18 crc kubenswrapper[4672]: I1206 09:49:18.958961 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" event={"ID":"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f","Type":"ContainerStarted","Data":"e0c9f1e583978b6dfc7a40bb97b4b74e43699267a4d77bb15e4d9658efd2ad9f"} Dec 06 09:49:19 crc kubenswrapper[4672]: I1206 09:49:19.004116 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" podStartSLOduration=1.5183797700000001 podStartE2EDuration="2.004100208s" podCreationTimestamp="2025-12-06 09:49:17 +0000 UTC" firstStartedPulling="2025-12-06 09:49:18.027832271 +0000 UTC m=+2575.772092568" lastFinishedPulling="2025-12-06 09:49:18.513552689 +0000 UTC m=+2576.257813006" observedRunningTime="2025-12-06 09:49:18.999297328 +0000 UTC m=+2576.743557615" watchObservedRunningTime="2025-12-06 09:49:19.004100208 +0000 UTC m=+2576.748360495" Dec 06 09:49:25 crc kubenswrapper[4672]: I1206 09:49:25.003489 4672 generic.go:334] "Generic (PLEG): container finished" podID="a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" containerID="a6e35f0d129991050fa6949257f9da086beb4ab6b4ad0b90a4377bb9abead359" exitCode=0 Dec 06 09:49:25 crc kubenswrapper[4672]: I1206 09:49:25.003591 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" event={"ID":"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f","Type":"ContainerDied","Data":"a6e35f0d129991050fa6949257f9da086beb4ab6b4ad0b90a4377bb9abead359"} Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.581184 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.764909 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-inventory\") pod \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.764972 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ssh-key\") pod \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.765050 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9g7\" (UniqueName: \"kubernetes.io/projected/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-kube-api-access-sk9g7\") pod \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.765083 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ceph\") pod \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\" (UID: \"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f\") " Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.769613 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-kube-api-access-sk9g7" (OuterVolumeSpecName: "kube-api-access-sk9g7") pod "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" (UID: "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f"). InnerVolumeSpecName "kube-api-access-sk9g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.772156 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ceph" (OuterVolumeSpecName: "ceph") pod "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" (UID: "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.791803 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-inventory" (OuterVolumeSpecName: "inventory") pod "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" (UID: "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.809387 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" (UID: "a0bb0cdb-025d-4251-b0f0-06185ea6fe8f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.867228 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.867261 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9g7\" (UniqueName: \"kubernetes.io/projected/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-kube-api-access-sk9g7\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.867273 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:26 crc kubenswrapper[4672]: I1206 09:49:26.867280 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0bb0cdb-025d-4251-b0f0-06185ea6fe8f-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.031765 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" event={"ID":"a0bb0cdb-025d-4251-b0f0-06185ea6fe8f","Type":"ContainerDied","Data":"e0c9f1e583978b6dfc7a40bb97b4b74e43699267a4d77bb15e4d9658efd2ad9f"} Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.031819 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c9f1e583978b6dfc7a40bb97b4b74e43699267a4d77bb15e4d9658efd2ad9f" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.031796 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.144049 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr"] Dec 06 09:49:27 crc kubenswrapper[4672]: E1206 09:49:27.144579 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.144637 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.144960 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bb0cdb-025d-4251-b0f0-06185ea6fe8f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.145949 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.149795 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.150079 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.150411 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.150711 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.150974 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.151204 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.163532 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr"] Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.276080 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.276168 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.276243 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.276269 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.276303 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ddc\" (UniqueName: \"kubernetes.io/projected/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-kube-api-access-92ddc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc 
kubenswrapper[4672]: I1206 09:49:27.276349 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.377409 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ddc\" (UniqueName: \"kubernetes.io/projected/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-kube-api-access-92ddc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.377469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.377533 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.377594 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.377833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.377859 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.380214 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.382179 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.382450 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.382785 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.383176 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.394775 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ddc\" (UniqueName: \"kubernetes.io/projected/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-kube-api-access-92ddc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ckzpr\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:27 crc kubenswrapper[4672]: I1206 09:49:27.496423 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:49:28 crc kubenswrapper[4672]: I1206 09:49:28.064468 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr"] Dec 06 09:49:29 crc kubenswrapper[4672]: I1206 09:49:29.049748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" event={"ID":"3c8ad536-4cb5-4454-bcc3-5b13cb92215d","Type":"ContainerStarted","Data":"45c25f56884a3f6fc2871282fa3e39f18cd20786f0bdeb8fc84a46ece288592b"} Dec 06 09:49:29 crc kubenswrapper[4672]: I1206 09:49:29.050143 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" event={"ID":"3c8ad536-4cb5-4454-bcc3-5b13cb92215d","Type":"ContainerStarted","Data":"07624152fa1fcbc29384bbc8d9fbef4a962742acd447030219d792fa054024fc"} Dec 06 09:49:29 crc kubenswrapper[4672]: I1206 09:49:29.073830 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" podStartSLOduration=1.44194455 podStartE2EDuration="2.073807259s" podCreationTimestamp="2025-12-06 09:49:27 +0000 UTC" firstStartedPulling="2025-12-06 09:49:28.083549103 +0000 UTC m=+2585.827809400" lastFinishedPulling="2025-12-06 09:49:28.715411812 +0000 UTC m=+2586.459672109" observedRunningTime="2025-12-06 09:49:29.065947346 +0000 UTC m=+2586.810207663" watchObservedRunningTime="2025-12-06 09:49:29.073807259 +0000 UTC m=+2586.818067556" Dec 06 09:50:42 crc kubenswrapper[4672]: I1206 09:50:42.662107 4672 generic.go:334] "Generic (PLEG): container finished" podID="3c8ad536-4cb5-4454-bcc3-5b13cb92215d" containerID="45c25f56884a3f6fc2871282fa3e39f18cd20786f0bdeb8fc84a46ece288592b" exitCode=0 Dec 06 09:50:42 crc kubenswrapper[4672]: I1206 09:50:42.662206 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" event={"ID":"3c8ad536-4cb5-4454-bcc3-5b13cb92215d","Type":"ContainerDied","Data":"45c25f56884a3f6fc2871282fa3e39f18cd20786f0bdeb8fc84a46ece288592b"} Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.174273 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.287998 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-inventory\") pod \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.288122 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92ddc\" (UniqueName: \"kubernetes.io/projected/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-kube-api-access-92ddc\") pod \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.288170 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ssh-key\") pod \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.288240 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovncontroller-config-0\") pod \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.288293 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovn-combined-ca-bundle\") pod \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.288338 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ceph\") pod \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\" (UID: \"3c8ad536-4cb5-4454-bcc3-5b13cb92215d\") " Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.295535 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-kube-api-access-92ddc" (OuterVolumeSpecName: "kube-api-access-92ddc") pod "3c8ad536-4cb5-4454-bcc3-5b13cb92215d" (UID: "3c8ad536-4cb5-4454-bcc3-5b13cb92215d"). InnerVolumeSpecName "kube-api-access-92ddc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.297106 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3c8ad536-4cb5-4454-bcc3-5b13cb92215d" (UID: "3c8ad536-4cb5-4454-bcc3-5b13cb92215d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.299341 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ceph" (OuterVolumeSpecName: "ceph") pod "3c8ad536-4cb5-4454-bcc3-5b13cb92215d" (UID: "3c8ad536-4cb5-4454-bcc3-5b13cb92215d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.316989 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c8ad536-4cb5-4454-bcc3-5b13cb92215d" (UID: "3c8ad536-4cb5-4454-bcc3-5b13cb92215d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.321910 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3c8ad536-4cb5-4454-bcc3-5b13cb92215d" (UID: "3c8ad536-4cb5-4454-bcc3-5b13cb92215d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.323691 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-inventory" (OuterVolumeSpecName: "inventory") pod "3c8ad536-4cb5-4454-bcc3-5b13cb92215d" (UID: "3c8ad536-4cb5-4454-bcc3-5b13cb92215d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.390307 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.390342 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92ddc\" (UniqueName: \"kubernetes.io/projected/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-kube-api-access-92ddc\") on node \"crc\" DevicePath \"\"" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.390354 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.390362 4672 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.390372 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.390382 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c8ad536-4cb5-4454-bcc3-5b13cb92215d-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.682311 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" event={"ID":"3c8ad536-4cb5-4454-bcc3-5b13cb92215d","Type":"ContainerDied","Data":"07624152fa1fcbc29384bbc8d9fbef4a962742acd447030219d792fa054024fc"} Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.682363 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07624152fa1fcbc29384bbc8d9fbef4a962742acd447030219d792fa054024fc" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 
09:50:44.682674 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ckzpr" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.844747 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f"] Dec 06 09:50:44 crc kubenswrapper[4672]: E1206 09:50:44.845484 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8ad536-4cb5-4454-bcc3-5b13cb92215d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.845514 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8ad536-4cb5-4454-bcc3-5b13cb92215d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.845768 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8ad536-4cb5-4454-bcc3-5b13cb92215d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.846482 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.848487 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.850325 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.851332 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.851546 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.851871 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.861472 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f"] Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.862175 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:50:44 crc kubenswrapper[4672]: I1206 09:50:44.862653 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002102 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002346 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002466 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc884\" (UniqueName: \"kubernetes.io/projected/0a4871b2-574b-433c-8491-9147da825602-kube-api-access-rc884\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002579 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002718 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002804 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.002909 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.104971 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc884\" (UniqueName: \"kubernetes.io/projected/0a4871b2-574b-433c-8491-9147da825602-kube-api-access-rc884\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.105270 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.105407 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.105523 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.105704 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.105857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.105964 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.120872 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.121032 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.122788 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: 
I1206 09:50:45.126795 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.129217 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.129640 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.130436 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc884\" (UniqueName: \"kubernetes.io/projected/0a4871b2-574b-433c-8491-9147da825602-kube-api-access-rc884\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.211508 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:50:45 crc kubenswrapper[4672]: I1206 09:50:45.766655 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f"] Dec 06 09:50:46 crc kubenswrapper[4672]: I1206 09:50:46.699165 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" event={"ID":"0a4871b2-574b-433c-8491-9147da825602","Type":"ContainerStarted","Data":"dbf3ea471eec72224ae7af4d3a6a6376464207fd50e47b0e1bbf5b5bea2341fd"} Dec 06 09:50:46 crc kubenswrapper[4672]: I1206 09:50:46.700409 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" event={"ID":"0a4871b2-574b-433c-8491-9147da825602","Type":"ContainerStarted","Data":"b3e7f6e1810eada2aa72ddc020ddde0b7ae69b026c1bd5de3ad690f0bae274ed"} Dec 06 09:50:46 crc kubenswrapper[4672]: I1206 09:50:46.722075 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" podStartSLOduration=2.339845206 podStartE2EDuration="2.722051266s" podCreationTimestamp="2025-12-06 09:50:44 +0000 UTC" firstStartedPulling="2025-12-06 09:50:45.801536296 +0000 UTC m=+2663.545796583" lastFinishedPulling="2025-12-06 09:50:46.183742356 +0000 UTC m=+2663.928002643" observedRunningTime="2025-12-06 09:50:46.715743305 +0000 UTC m=+2664.460003602" watchObservedRunningTime="2025-12-06 09:50:46.722051266 +0000 UTC m=+2664.466311553" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.587758 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s2tbc"] Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.591699 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.600919 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2tbc"] Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.726427 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-catalog-content\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.726672 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-utilities\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.726707 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92pz\" (UniqueName: \"kubernetes.io/projected/152bf5c3-c214-49ec-8d3c-174075f82c0b-kube-api-access-x92pz\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.828328 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-catalog-content\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.828445 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-utilities\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.828482 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92pz\" (UniqueName: \"kubernetes.io/projected/152bf5c3-c214-49ec-8d3c-174075f82c0b-kube-api-access-x92pz\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.829478 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-catalog-content\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.830547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-utilities\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.857000 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x92pz\" (UniqueName: \"kubernetes.io/projected/152bf5c3-c214-49ec-8d3c-174075f82c0b-kube-api-access-x92pz\") pod \"redhat-marketplace-s2tbc\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:02 crc kubenswrapper[4672]: I1206 09:51:02.914965 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:03 crc kubenswrapper[4672]: W1206 09:51:03.434045 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod152bf5c3_c214_49ec_8d3c_174075f82c0b.slice/crio-e84dc24a57d29abe5d34935911b044fa0a5ac83b8d9fa9975cbd70c4274e7d65 WatchSource:0}: Error finding container e84dc24a57d29abe5d34935911b044fa0a5ac83b8d9fa9975cbd70c4274e7d65: Status 404 returned error can't find the container with id e84dc24a57d29abe5d34935911b044fa0a5ac83b8d9fa9975cbd70c4274e7d65 Dec 06 09:51:03 crc kubenswrapper[4672]: I1206 09:51:03.443316 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2tbc"] Dec 06 09:51:03 crc kubenswrapper[4672]: I1206 09:51:03.846092 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2tbc" event={"ID":"152bf5c3-c214-49ec-8d3c-174075f82c0b","Type":"ContainerStarted","Data":"e84dc24a57d29abe5d34935911b044fa0a5ac83b8d9fa9975cbd70c4274e7d65"} Dec 06 09:51:04 crc kubenswrapper[4672]: I1206 09:51:04.855878 4672 generic.go:334] "Generic (PLEG): container finished" podID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerID="df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed" exitCode=0 Dec 06 09:51:04 crc kubenswrapper[4672]: I1206 09:51:04.856067 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2tbc" event={"ID":"152bf5c3-c214-49ec-8d3c-174075f82c0b","Type":"ContainerDied","Data":"df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed"} Dec 06 09:51:06 crc kubenswrapper[4672]: I1206 09:51:06.874046 4672 generic.go:334] "Generic (PLEG): container finished" podID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerID="08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e" exitCode=0 Dec 06 09:51:06 crc kubenswrapper[4672]: I1206 09:51:06.874638 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2tbc" event={"ID":"152bf5c3-c214-49ec-8d3c-174075f82c0b","Type":"ContainerDied","Data":"08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e"} Dec 06 09:51:06 crc kubenswrapper[4672]: I1206 09:51:06.876438 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:51:07 crc kubenswrapper[4672]: I1206 09:51:07.884053 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2tbc" event={"ID":"152bf5c3-c214-49ec-8d3c-174075f82c0b","Type":"ContainerStarted","Data":"957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0"} Dec 06 09:51:12 crc kubenswrapper[4672]: I1206 09:51:12.915102 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:12 crc kubenswrapper[4672]: I1206 09:51:12.917042 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:12 crc 
kubenswrapper[4672]: I1206 09:51:12.982893 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:13 crc kubenswrapper[4672]: I1206 09:51:13.005021 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s2tbc" podStartSLOduration=8.548639007 podStartE2EDuration="11.005000419s" podCreationTimestamp="2025-12-06 09:51:02 +0000 UTC" firstStartedPulling="2025-12-06 09:51:04.85962128 +0000 UTC m=+2682.603881567" lastFinishedPulling="2025-12-06 09:51:07.315982652 +0000 UTC m=+2685.060242979" observedRunningTime="2025-12-06 09:51:07.905028643 +0000 UTC m=+2685.649288940" watchObservedRunningTime="2025-12-06 09:51:13.005000419 +0000 UTC m=+2690.749260706" Dec 06 09:51:13 crc kubenswrapper[4672]: I1206 09:51:13.983235 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:14 crc kubenswrapper[4672]: I1206 09:51:14.048121 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2tbc"] Dec 06 09:51:15 crc kubenswrapper[4672]: I1206 09:51:15.951087 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s2tbc" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="registry-server" containerID="cri-o://957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0" gracePeriod=2 Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.400727 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.537864 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-utilities\") pod \"152bf5c3-c214-49ec-8d3c-174075f82c0b\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.538058 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-catalog-content\") pod \"152bf5c3-c214-49ec-8d3c-174075f82c0b\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.538176 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92pz\" (UniqueName: \"kubernetes.io/projected/152bf5c3-c214-49ec-8d3c-174075f82c0b-kube-api-access-x92pz\") pod \"152bf5c3-c214-49ec-8d3c-174075f82c0b\" (UID: \"152bf5c3-c214-49ec-8d3c-174075f82c0b\") " Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.538940 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-utilities" (OuterVolumeSpecName: "utilities") pod "152bf5c3-c214-49ec-8d3c-174075f82c0b" (UID: "152bf5c3-c214-49ec-8d3c-174075f82c0b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.546860 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152bf5c3-c214-49ec-8d3c-174075f82c0b-kube-api-access-x92pz" (OuterVolumeSpecName: "kube-api-access-x92pz") pod "152bf5c3-c214-49ec-8d3c-174075f82c0b" (UID: "152bf5c3-c214-49ec-8d3c-174075f82c0b"). InnerVolumeSpecName "kube-api-access-x92pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.567185 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "152bf5c3-c214-49ec-8d3c-174075f82c0b" (UID: "152bf5c3-c214-49ec-8d3c-174075f82c0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.640076 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.640108 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92pz\" (UniqueName: \"kubernetes.io/projected/152bf5c3-c214-49ec-8d3c-174075f82c0b-kube-api-access-x92pz\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.640121 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/152bf5c3-c214-49ec-8d3c-174075f82c0b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.962056 4672 generic.go:334] "Generic (PLEG): container finished" podID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerID="957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0" exitCode=0 Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.962107 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2tbc" event={"ID":"152bf5c3-c214-49ec-8d3c-174075f82c0b","Type":"ContainerDied","Data":"957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0"} Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.962133 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2tbc" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.962160 4672 scope.go:117] "RemoveContainer" containerID="957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0" Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.962146 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2tbc" event={"ID":"152bf5c3-c214-49ec-8d3c-174075f82c0b","Type":"ContainerDied","Data":"e84dc24a57d29abe5d34935911b044fa0a5ac83b8d9fa9975cbd70c4274e7d65"} Dec 06 09:51:16 crc kubenswrapper[4672]: I1206 09:51:16.984203 4672 scope.go:117] "RemoveContainer" containerID="08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.006228 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2tbc"] Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.018840 4672 scope.go:117] "RemoveContainer" containerID="df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.020295 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2tbc"] Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.054447 4672 scope.go:117] "RemoveContainer" containerID="957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0" Dec 06 09:51:17 crc kubenswrapper[4672]: E1206 09:51:17.054909 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0\": container with ID starting with 957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0 not found: ID does not exist" containerID="957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.054937 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0"} err="failed to get container status \"957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0\": rpc error: code = NotFound desc = could not find container \"957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0\": container with ID starting with 957999b7c2b7ddf0deb07b9b1102fa4fc83d01dce9db63924d06f22a152f94f0 not found: ID does not exist" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.054965 4672 scope.go:117] "RemoveContainer" containerID="08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e" Dec 06 09:51:17 crc kubenswrapper[4672]: E1206 09:51:17.055188 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e\": container with ID starting with 08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e not found: ID does not exist" containerID="08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.055215 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e"} err="failed to get container status \"08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e\": rpc error: code = NotFound desc = could not find 
container \"08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e\": container with ID starting with 08574c1f3748b1f0e0ab008555272c390a6eae09892d7e79e97b8b2427b8180e not found: ID does not exist" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.055233 4672 scope.go:117] "RemoveContainer" containerID="df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed" Dec 06 09:51:17 crc kubenswrapper[4672]: E1206 09:51:17.055533 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed\": container with ID starting with df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed not found: ID does not exist" containerID="df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed" Dec 06 09:51:17 crc kubenswrapper[4672]: I1206 09:51:17.055580 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed"} err="failed to get container status \"df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed\": rpc error: code = NotFound desc = could not find container \"df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed\": container with ID starting with df8de846b90bc751f2df8e2833e9de5d2e2d2e2b1b2756c620b87a963414f2ed not found: ID does not exist" Dec 06 09:51:18 crc kubenswrapper[4672]: I1206 09:51:18.568135 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" path="/var/lib/kubelet/pods/152bf5c3-c214-49ec-8d3c-174075f82c0b/volumes" Dec 06 09:51:42 crc kubenswrapper[4672]: I1206 09:51:42.319688 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:51:42 crc kubenswrapper[4672]: I1206 09:51:42.320558 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:51:53 crc kubenswrapper[4672]: I1206 09:51:53.322625 4672 generic.go:334] "Generic (PLEG): container finished" podID="0a4871b2-574b-433c-8491-9147da825602" containerID="dbf3ea471eec72224ae7af4d3a6a6376464207fd50e47b0e1bbf5b5bea2341fd" exitCode=0 Dec 06 09:51:53 crc kubenswrapper[4672]: I1206 09:51:53.322760 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" event={"ID":"0a4871b2-574b-433c-8491-9147da825602","Type":"ContainerDied","Data":"dbf3ea471eec72224ae7af4d3a6a6376464207fd50e47b0e1bbf5b5bea2341fd"} Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.683832 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.800810 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ceph\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.800911 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-metadata-combined-ca-bundle\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.800961 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-nova-metadata-neutron-config-0\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.801318 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-inventory\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.801356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ssh-key\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.801489 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc884\" (UniqueName: \"kubernetes.io/projected/0a4871b2-574b-433c-8491-9147da825602-kube-api-access-rc884\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.801536 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0a4871b2-574b-433c-8491-9147da825602\" (UID: \"0a4871b2-574b-433c-8491-9147da825602\") " Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.817307 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.821372 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4871b2-574b-433c-8491-9147da825602-kube-api-access-rc884" (OuterVolumeSpecName: "kube-api-access-rc884") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). 
InnerVolumeSpecName "kube-api-access-rc884". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.822499 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ceph" (OuterVolumeSpecName: "ceph") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.827578 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-inventory" (OuterVolumeSpecName: "inventory") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.831353 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.834785 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.848056 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0a4871b2-574b-433c-8491-9147da825602" (UID: "0a4871b2-574b-433c-8491-9147da825602"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.904249 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.904540 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.904654 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc884\" (UniqueName: \"kubernetes.io/projected/0a4871b2-574b-433c-8491-9147da825602-kube-api-access-rc884\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.904761 4672 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.904871 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.904959 4672 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:54 crc kubenswrapper[4672]: I1206 09:51:54.905045 4672 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0a4871b2-574b-433c-8491-9147da825602-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.340086 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" event={"ID":"0a4871b2-574b-433c-8491-9147da825602","Type":"ContainerDied","Data":"b3e7f6e1810eada2aa72ddc020ddde0b7ae69b026c1bd5de3ad690f0bae274ed"} Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.340396 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e7f6e1810eada2aa72ddc020ddde0b7ae69b026c1bd5de3ad690f0bae274ed" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.340153 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.525537 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r"] Dec 06 09:51:55 crc kubenswrapper[4672]: E1206 09:51:55.525896 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="extract-utilities" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.525912 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="extract-utilities" Dec 06 09:51:55 crc kubenswrapper[4672]: E1206 09:51:55.525927 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4871b2-574b-433c-8491-9147da825602" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.525935 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4871b2-574b-433c-8491-9147da825602" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 09:51:55 crc kubenswrapper[4672]: E1206 09:51:55.525961 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="extract-content" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.525967 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="extract-content" Dec 06 09:51:55 crc kubenswrapper[4672]: E1206 09:51:55.525983 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="registry-server" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.525988 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="registry-server" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.526143 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4871b2-574b-433c-8491-9147da825602" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.526154 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="152bf5c3-c214-49ec-8d3c-174075f82c0b" containerName="registry-server" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.526784 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.531623 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.532058 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.532133 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.532177 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.532063 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.534463 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.538617 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r"] Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.718490 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.718533 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.718556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.718615 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.718635 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc 
kubenswrapper[4672]: I1206 09:51:55.718685 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxwt\" (UniqueName: \"kubernetes.io/projected/cc85e883-c516-489f-b15d-6e57e4236b75-kube-api-access-vpxwt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.820463 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.820505 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.820523 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.820556 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.820576 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.820619 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxwt\" (UniqueName: \"kubernetes.io/projected/cc85e883-c516-489f-b15d-6e57e4236b75-kube-api-access-vpxwt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.825068 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.825681 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.825985 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.829219 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.839626 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxwt\" (UniqueName: \"kubernetes.io/projected/cc85e883-c516-489f-b15d-6e57e4236b75-kube-api-access-vpxwt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.842140 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:55 crc kubenswrapper[4672]: I1206 09:51:55.843015 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:51:56 crc kubenswrapper[4672]: I1206 09:51:56.361809 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r"] Dec 06 09:51:57 crc kubenswrapper[4672]: I1206 09:51:57.358040 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" event={"ID":"cc85e883-c516-489f-b15d-6e57e4236b75","Type":"ContainerStarted","Data":"40978211a115bc9a5378ec7e30af1e055397c6e62a1fef0d627bdb480eae7553"} Dec 06 09:51:57 crc kubenswrapper[4672]: I1206 09:51:57.358619 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" event={"ID":"cc85e883-c516-489f-b15d-6e57e4236b75","Type":"ContainerStarted","Data":"e059a91d0593c64504621f0954cf0d09c1ab22420e243dea70153765ca5fe419"} Dec 06 09:51:57 crc kubenswrapper[4672]: I1206 09:51:57.382799 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" podStartSLOduration=1.9206664770000001 podStartE2EDuration="2.382780659s" podCreationTimestamp="2025-12-06 09:51:55 +0000 UTC" firstStartedPulling="2025-12-06 09:51:56.377553185 +0000 UTC m=+2734.121813492" lastFinishedPulling="2025-12-06 09:51:56.839667387 +0000 UTC m=+2734.583927674" observedRunningTime="2025-12-06 09:51:57.37840022 +0000 UTC m=+2735.122660507" watchObservedRunningTime="2025-12-06 09:51:57.382780659 +0000 UTC m=+2735.127040946" Dec 06 09:52:12 crc kubenswrapper[4672]: I1206 09:52:12.320078 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:52:12 crc kubenswrapper[4672]: I1206 09:52:12.320681 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.319899 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.320476 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.320534 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.321374 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7cd1be7755bf34af00f4c17fd0804f052c17aac5446317d3315921d4c9466ed8"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.321439 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://7cd1be7755bf34af00f4c17fd0804f052c17aac5446317d3315921d4c9466ed8" gracePeriod=600 Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.776316 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="7cd1be7755bf34af00f4c17fd0804f052c17aac5446317d3315921d4c9466ed8" exitCode=0 Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.776379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"7cd1be7755bf34af00f4c17fd0804f052c17aac5446317d3315921d4c9466ed8"} Dec 06 09:52:42 crc kubenswrapper[4672]: I1206 09:52:42.776438 4672 scope.go:117] "RemoveContainer" containerID="9db37941c2f5797e3cc2e07c2a5ea926cd6224bccf10d013946bef80402ff7bb" Dec 06 09:52:44 crc kubenswrapper[4672]: I1206 09:52:44.810075 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55"} Dec 06 09:53:47 crc kubenswrapper[4672]: I1206 09:53:47.866400 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vcsgx"] Dec 06 09:53:47 crc kubenswrapper[4672]: I1206 09:53:47.873315 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:47 crc kubenswrapper[4672]: I1206 09:53:47.897822 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcsgx"] Dec 06 09:53:47 crc kubenswrapper[4672]: I1206 09:53:47.970429 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-utilities\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:47 crc kubenswrapper[4672]: I1206 09:53:47.970647 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-catalog-content\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:47 crc kubenswrapper[4672]: I1206 09:53:47.970769 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnfm\" (UniqueName: \"kubernetes.io/projected/278c980d-8b4e-4f3b-93d8-de725cdfa39d-kube-api-access-4tnfm\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.072412 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tnfm\" (UniqueName: \"kubernetes.io/projected/278c980d-8b4e-4f3b-93d8-de725cdfa39d-kube-api-access-4tnfm\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.072510 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-utilities\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.072647 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-catalog-content\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.073433 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-catalog-content\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.073778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-utilities\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.091738 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4tnfm\" (UniqueName: \"kubernetes.io/projected/278c980d-8b4e-4f3b-93d8-de725cdfa39d-kube-api-access-4tnfm\") pod \"redhat-operators-vcsgx\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.205180 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:48 crc kubenswrapper[4672]: I1206 09:53:48.689052 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcsgx"] Dec 06 09:53:49 crc kubenswrapper[4672]: I1206 09:53:49.709303 4672 generic.go:334] "Generic (PLEG): container finished" podID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerID="8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff" exitCode=0 Dec 06 09:53:49 crc kubenswrapper[4672]: I1206 09:53:49.709377 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerDied","Data":"8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff"} Dec 06 09:53:49 crc kubenswrapper[4672]: I1206 09:53:49.711305 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerStarted","Data":"b0ccc720a092ce80cb9a09e13edd3de007e6d3f88c90cf14a1a9676579c28d15"} Dec 06 09:53:50 crc kubenswrapper[4672]: I1206 09:53:50.742573 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerStarted","Data":"bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756"} Dec 06 09:53:52 crc kubenswrapper[4672]: I1206 09:53:52.764448 4672 generic.go:334] "Generic (PLEG): container finished" podID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerID="bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756" exitCode=0 Dec 06 09:53:52 crc kubenswrapper[4672]: I1206 09:53:52.764491 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerDied","Data":"bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756"} Dec 06 09:53:54 crc kubenswrapper[4672]: I1206 09:53:54.799085 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerStarted","Data":"09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2"} Dec 06 09:53:54 crc kubenswrapper[4672]: I1206 09:53:54.829177 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vcsgx" podStartSLOduration=3.67083802 podStartE2EDuration="7.829157567s" podCreationTimestamp="2025-12-06 09:53:47 +0000 UTC" firstStartedPulling="2025-12-06 09:53:49.712811113 +0000 UTC m=+2847.457071440" lastFinishedPulling="2025-12-06 09:53:53.8711307 +0000 UTC m=+2851.615390987" observedRunningTime="2025-12-06 09:53:54.824527972 +0000 UTC m=+2852.568788269" watchObservedRunningTime="2025-12-06 09:53:54.829157567 +0000 UTC m=+2852.573417874" Dec 06 09:53:58 crc kubenswrapper[4672]: I1206 09:53:58.205936 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 
06 09:53:58 crc kubenswrapper[4672]: I1206 09:53:58.206565 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:53:59 crc kubenswrapper[4672]: I1206 09:53:59.261154 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vcsgx" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="registry-server" probeResult="failure" output=< Dec 06 09:53:59 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 09:53:59 crc kubenswrapper[4672]: > Dec 06 09:54:08 crc kubenswrapper[4672]: I1206 09:54:08.305863 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:54:08 crc kubenswrapper[4672]: I1206 09:54:08.370104 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:54:08 crc kubenswrapper[4672]: I1206 09:54:08.545464 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcsgx"] Dec 06 09:54:09 crc kubenswrapper[4672]: I1206 09:54:09.933222 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vcsgx" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="registry-server" containerID="cri-o://09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2" gracePeriod=2 Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.408553 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.526254 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-catalog-content\") pod \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.526566 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-utilities\") pod \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.526685 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tnfm\" (UniqueName: \"kubernetes.io/projected/278c980d-8b4e-4f3b-93d8-de725cdfa39d-kube-api-access-4tnfm\") pod \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\" (UID: \"278c980d-8b4e-4f3b-93d8-de725cdfa39d\") " Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.527385 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-utilities" (OuterVolumeSpecName: "utilities") pod "278c980d-8b4e-4f3b-93d8-de725cdfa39d" (UID: "278c980d-8b4e-4f3b-93d8-de725cdfa39d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.545851 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c980d-8b4e-4f3b-93d8-de725cdfa39d-kube-api-access-4tnfm" (OuterVolumeSpecName: "kube-api-access-4tnfm") pod "278c980d-8b4e-4f3b-93d8-de725cdfa39d" (UID: "278c980d-8b4e-4f3b-93d8-de725cdfa39d"). InnerVolumeSpecName "kube-api-access-4tnfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.629276 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.629317 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tnfm\" (UniqueName: \"kubernetes.io/projected/278c980d-8b4e-4f3b-93d8-de725cdfa39d-kube-api-access-4tnfm\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.648318 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "278c980d-8b4e-4f3b-93d8-de725cdfa39d" (UID: "278c980d-8b4e-4f3b-93d8-de725cdfa39d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.730983 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c980d-8b4e-4f3b-93d8-de725cdfa39d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.949092 4672 generic.go:334] "Generic (PLEG): container finished" podID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerID="09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2" exitCode=0 Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.949137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerDied","Data":"09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2"} Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.949187 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcsgx" event={"ID":"278c980d-8b4e-4f3b-93d8-de725cdfa39d","Type":"ContainerDied","Data":"b0ccc720a092ce80cb9a09e13edd3de007e6d3f88c90cf14a1a9676579c28d15"} Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.949209 4672 scope.go:117] "RemoveContainer" containerID="09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.949221 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcsgx" Dec 06 09:54:10 crc kubenswrapper[4672]: I1206 09:54:10.997946 4672 scope.go:117] "RemoveContainer" containerID="bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.024489 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcsgx"] Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.043305 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vcsgx"] Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.054111 4672 scope.go:117] "RemoveContainer" containerID="8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.093182 4672 scope.go:117] "RemoveContainer" containerID="09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2" Dec 06 09:54:11 crc kubenswrapper[4672]: E1206 09:54:11.093703 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2\": container with ID starting with 09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2 not found: ID does not exist" containerID="09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.093771 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2"} err="failed to get container status \"09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2\": rpc error: code = NotFound desc = could not find container \"09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2\": container with ID starting with 09decfa1d12521d22c41c10adfc220e5f6d4921b8aa7547fa801ea27dc321be2 not found: ID does not exist" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.093798 4672 scope.go:117] "RemoveContainer" containerID="bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756" Dec 06 09:54:11 crc kubenswrapper[4672]: E1206 09:54:11.094490 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756\": container with ID starting with bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756 not found: ID does not exist" containerID="bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.094514 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756"} err="failed to get container status \"bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756\": rpc error: code = NotFound desc = could not find container \"bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756\": container with ID starting with bea7610d2db19aa250b2ff47dbf4f7e67e11d6c3b5f38b6b2c30bc8fa9de4756 not found: ID does not exist" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.094528 4672 scope.go:117] "RemoveContainer" containerID="8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff" Dec 06 09:54:11 crc kubenswrapper[4672]: E1206 09:54:11.094856 4672 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff\": container with ID starting with 8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff not found: ID does not exist" containerID="8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff" Dec 06 09:54:11 crc kubenswrapper[4672]: I1206 09:54:11.094904 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff"} err="failed to get container status \"8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff\": rpc error: code = NotFound desc = could not find container \"8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff\": container with ID starting with 8fc4076cabb871abbe31930c36e76af266571869635dd832f3884a427fd963ff not found: ID does not exist" Dec 06 09:54:12 crc kubenswrapper[4672]: I1206 09:54:12.568904 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" path="/var/lib/kubelet/pods/278c980d-8b4e-4f3b-93d8-de725cdfa39d/volumes" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.130335 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-twwt7"] Dec 06 09:54:50 crc kubenswrapper[4672]: E1206 09:54:50.131338 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="extract-content" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.131354 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="extract-content" Dec 06 09:54:50 crc kubenswrapper[4672]: E1206 09:54:50.131372 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="registry-server" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.131380 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="registry-server" Dec 06 09:54:50 crc kubenswrapper[4672]: E1206 09:54:50.131409 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="extract-utilities" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.131417 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="extract-utilities" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.131659 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="278c980d-8b4e-4f3b-93d8-de725cdfa39d" containerName="registry-server" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.133236 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.143748 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-twwt7"] Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.238111 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-catalog-content\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.238418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-utilities\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.238560 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5dr\" (UniqueName: \"kubernetes.io/projected/f68a1365-5a93-4505-99ae-e63a2c422aac-kube-api-access-6j5dr\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.342116 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-catalog-content\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.342184 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-utilities\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.342242 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j5dr\" (UniqueName: \"kubernetes.io/projected/f68a1365-5a93-4505-99ae-e63a2c422aac-kube-api-access-6j5dr\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.342952 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-utilities\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.342991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-catalog-content\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.361986 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6j5dr\" (UniqueName: \"kubernetes.io/projected/f68a1365-5a93-4505-99ae-e63a2c422aac-kube-api-access-6j5dr\") pod \"certified-operators-twwt7\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.456668 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.926576 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hb9f"] Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.929854 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:50 crc kubenswrapper[4672]: I1206 09:54:50.940850 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hb9f"] Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.063209 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgs7k\" (UniqueName: \"kubernetes.io/projected/79bc8e8e-4c59-4615-a582-cd20864f2204-kube-api-access-qgs7k\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.063273 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-utilities\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.063435 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-catalog-content\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.076242 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-twwt7"] Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.165654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgs7k\" (UniqueName: \"kubernetes.io/projected/79bc8e8e-4c59-4615-a582-cd20864f2204-kube-api-access-qgs7k\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.165705 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-utilities\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.165770 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-catalog-content\") pod \"community-operators-7hb9f\" (UID: 
\"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.166243 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-catalog-content\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.166771 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-utilities\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.198260 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgs7k\" (UniqueName: \"kubernetes.io/projected/79bc8e8e-4c59-4615-a582-cd20864f2204-kube-api-access-qgs7k\") pod \"community-operators-7hb9f\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.257235 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.349909 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerStarted","Data":"159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91"} Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.350135 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerStarted","Data":"390507c02edc636cc85afb0ceb20f4ba8a94faad37b2e8361d3c8213672383eb"} Dec 06 09:54:51 crc kubenswrapper[4672]: I1206 09:54:51.787699 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hb9f"] Dec 06 09:54:51 crc kubenswrapper[4672]: W1206 09:54:51.799970 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79bc8e8e_4c59_4615_a582_cd20864f2204.slice/crio-e7250f33c0a92ebab0efbc801dba52c2b271fdd2697d47885379ee66b9f40e0c WatchSource:0}: Error finding container e7250f33c0a92ebab0efbc801dba52c2b271fdd2697d47885379ee66b9f40e0c: Status 404 returned error can't find the container with id e7250f33c0a92ebab0efbc801dba52c2b271fdd2697d47885379ee66b9f40e0c Dec 06 09:54:52 crc kubenswrapper[4672]: I1206 09:54:52.366390 4672 generic.go:334] "Generic (PLEG): container finished" podID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerID="159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91" exitCode=0 Dec 06 09:54:52 crc kubenswrapper[4672]: I1206 09:54:52.366445 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerDied","Data":"159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91"} Dec 06 09:54:52 crc kubenswrapper[4672]: I1206 09:54:52.380447 4672 generic.go:334] "Generic (PLEG): container finished" 
podID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerID="9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2" exitCode=0 Dec 06 09:54:52 crc kubenswrapper[4672]: I1206 09:54:52.380704 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerDied","Data":"9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2"} Dec 06 09:54:52 crc kubenswrapper[4672]: I1206 09:54:52.380732 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerStarted","Data":"e7250f33c0a92ebab0efbc801dba52c2b271fdd2697d47885379ee66b9f40e0c"} Dec 06 09:54:53 crc kubenswrapper[4672]: I1206 09:54:53.389192 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerStarted","Data":"36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0"} Dec 06 09:54:53 crc kubenswrapper[4672]: I1206 09:54:53.391826 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerStarted","Data":"211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53"} Dec 06 09:54:54 crc kubenswrapper[4672]: I1206 09:54:54.402712 4672 generic.go:334] "Generic (PLEG): container finished" podID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerID="36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0" exitCode=0 Dec 06 09:54:54 crc kubenswrapper[4672]: I1206 09:54:54.402795 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerDied","Data":"36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0"} Dec 06 09:54:54 crc kubenswrapper[4672]: I1206 09:54:54.407374 4672 generic.go:334] "Generic (PLEG): container finished" podID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerID="211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53" exitCode=0 Dec 06 09:54:54 crc kubenswrapper[4672]: I1206 09:54:54.407432 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerDied","Data":"211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53"} Dec 06 09:54:55 crc kubenswrapper[4672]: I1206 09:54:55.423472 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerStarted","Data":"bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad"} Dec 06 09:54:55 crc kubenswrapper[4672]: I1206 09:54:55.426939 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerStarted","Data":"78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b"} Dec 06 09:54:55 crc kubenswrapper[4672]: I1206 09:54:55.457682 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hb9f" podStartSLOduration=3.036778001 podStartE2EDuration="5.457665552s" podCreationTimestamp="2025-12-06 09:54:50 +0000 UTC" 
firstStartedPulling="2025-12-06 09:54:52.382555386 +0000 UTC m=+2910.126815683" lastFinishedPulling="2025-12-06 09:54:54.803442947 +0000 UTC m=+2912.547703234" observedRunningTime="2025-12-06 09:54:55.446895081 +0000 UTC m=+2913.191155368" watchObservedRunningTime="2025-12-06 09:54:55.457665552 +0000 UTC m=+2913.201925839" Dec 06 09:54:55 crc kubenswrapper[4672]: I1206 09:54:55.467196 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-twwt7" podStartSLOduration=3.04194132 podStartE2EDuration="5.467180079s" podCreationTimestamp="2025-12-06 09:54:50 +0000 UTC" firstStartedPulling="2025-12-06 09:54:52.368449075 +0000 UTC m=+2910.112709362" lastFinishedPulling="2025-12-06 09:54:54.793687834 +0000 UTC m=+2912.537948121" observedRunningTime="2025-12-06 09:54:55.463705475 +0000 UTC m=+2913.207965762" watchObservedRunningTime="2025-12-06 09:54:55.467180079 +0000 UTC m=+2913.211440366" Dec 06 09:55:00 crc kubenswrapper[4672]: I1206 09:55:00.457163 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:55:00 crc kubenswrapper[4672]: I1206 09:55:00.458813 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:55:00 crc kubenswrapper[4672]: I1206 09:55:00.514278 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:55:00 crc kubenswrapper[4672]: I1206 09:55:00.577479 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:55:01 crc kubenswrapper[4672]: I1206 09:55:01.257884 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:55:01 crc kubenswrapper[4672]: I1206 09:55:01.258959 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:55:01 crc kubenswrapper[4672]: I1206 09:55:01.317166 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-twwt7"] Dec 06 09:55:01 crc kubenswrapper[4672]: I1206 09:55:01.317467 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:55:01 crc kubenswrapper[4672]: I1206 09:55:01.537348 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:55:02 crc kubenswrapper[4672]: I1206 09:55:02.503105 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-twwt7" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="registry-server" containerID="cri-o://78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b" gracePeriod=2 Dec 06 09:55:02 crc kubenswrapper[4672]: I1206 09:55:02.929829 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.081571 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-utilities\") pod \"f68a1365-5a93-4505-99ae-e63a2c422aac\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.081666 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-catalog-content\") pod \"f68a1365-5a93-4505-99ae-e63a2c422aac\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.081719 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j5dr\" (UniqueName: \"kubernetes.io/projected/f68a1365-5a93-4505-99ae-e63a2c422aac-kube-api-access-6j5dr\") pod \"f68a1365-5a93-4505-99ae-e63a2c422aac\" (UID: \"f68a1365-5a93-4505-99ae-e63a2c422aac\") " Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.082374 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-utilities" (OuterVolumeSpecName: "utilities") pod "f68a1365-5a93-4505-99ae-e63a2c422aac" (UID: "f68a1365-5a93-4505-99ae-e63a2c422aac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.088885 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68a1365-5a93-4505-99ae-e63a2c422aac-kube-api-access-6j5dr" (OuterVolumeSpecName: "kube-api-access-6j5dr") pod "f68a1365-5a93-4505-99ae-e63a2c422aac" (UID: "f68a1365-5a93-4505-99ae-e63a2c422aac"). InnerVolumeSpecName "kube-api-access-6j5dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.133254 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f68a1365-5a93-4505-99ae-e63a2c422aac" (UID: "f68a1365-5a93-4505-99ae-e63a2c422aac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.183931 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.183982 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68a1365-5a93-4505-99ae-e63a2c422aac-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.183992 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j5dr\" (UniqueName: \"kubernetes.io/projected/f68a1365-5a93-4505-99ae-e63a2c422aac-kube-api-access-6j5dr\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.514807 4672 generic.go:334] "Generic (PLEG): container finished" podID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerID="78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b" exitCode=0 Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.514887 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-twwt7" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.514917 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerDied","Data":"78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b"} Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.514996 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twwt7" event={"ID":"f68a1365-5a93-4505-99ae-e63a2c422aac","Type":"ContainerDied","Data":"390507c02edc636cc85afb0ceb20f4ba8a94faad37b2e8361d3c8213672383eb"} Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.515025 4672 scope.go:117] "RemoveContainer" containerID="78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.546496 4672 scope.go:117] "RemoveContainer" containerID="211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.566505 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-twwt7"] Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.574298 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-twwt7"] Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.582481 4672 scope.go:117] "RemoveContainer" containerID="159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.632247 4672 scope.go:117] "RemoveContainer" containerID="78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b" Dec 06 09:55:03 crc kubenswrapper[4672]: E1206 09:55:03.632836 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b\": container with ID starting with 78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b not found: ID does not exist" containerID="78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.632873 
4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b"} err="failed to get container status \"78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b\": rpc error: code = NotFound desc = could not find container \"78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b\": container with ID starting with 78c1806d865345d9469364e64f500e2788a63f1d7f0f3ca798f1c45199457e0b not found: ID does not exist" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.632898 4672 scope.go:117] "RemoveContainer" containerID="211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53" Dec 06 09:55:03 crc kubenswrapper[4672]: E1206 09:55:03.633167 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53\": container with ID starting with 211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53 not found: ID does not exist" containerID="211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.633198 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53"} err="failed to get container status \"211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53\": rpc error: code = NotFound desc = could not find container \"211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53\": container with ID starting with 211a2b39b6c8742291d48f1e8074dd6de57c3056b72842a06ec3b0578dd19a53 not found: ID does not exist" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.633216 4672 scope.go:117] "RemoveContainer" containerID="159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91" Dec 06 09:55:03 crc kubenswrapper[4672]: E1206 09:55:03.633452 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91\": container with ID starting with 159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91 not found: ID does not exist" containerID="159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.633477 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91"} err="failed to get container status \"159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91\": rpc error: code = NotFound desc = could not find container \"159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91\": container with ID starting with 159b1246db14c7805b1653f971b0f4fd37d4a1b62df7ae6f66dcefe71e759c91 not found: ID does not exist" Dec 06 09:55:03 crc kubenswrapper[4672]: I1206 09:55:03.719398 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hb9f"] Dec 06 09:55:04 crc kubenswrapper[4672]: I1206 09:55:04.530457 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7hb9f" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="registry-server" containerID="cri-o://bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad" gracePeriod=2 Dec 06 
09:55:04 crc kubenswrapper[4672]: I1206 09:55:04.582784 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" path="/var/lib/kubelet/pods/f68a1365-5a93-4505-99ae-e63a2c422aac/volumes" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.532813 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.539821 4672 generic.go:334] "Generic (PLEG): container finished" podID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerID="bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad" exitCode=0 Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.539857 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerDied","Data":"bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad"} Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.539879 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hb9f" event={"ID":"79bc8e8e-4c59-4615-a582-cd20864f2204","Type":"ContainerDied","Data":"e7250f33c0a92ebab0efbc801dba52c2b271fdd2697d47885379ee66b9f40e0c"} Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.539897 4672 scope.go:117] "RemoveContainer" containerID="bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.539981 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hb9f" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.573645 4672 scope.go:117] "RemoveContainer" containerID="36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.626552 4672 scope.go:117] "RemoveContainer" containerID="9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.644267 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-utilities\") pod \"79bc8e8e-4c59-4615-a582-cd20864f2204\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.644514 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgs7k\" (UniqueName: \"kubernetes.io/projected/79bc8e8e-4c59-4615-a582-cd20864f2204-kube-api-access-qgs7k\") pod \"79bc8e8e-4c59-4615-a582-cd20864f2204\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.644547 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-catalog-content\") pod \"79bc8e8e-4c59-4615-a582-cd20864f2204\" (UID: \"79bc8e8e-4c59-4615-a582-cd20864f2204\") " Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.645842 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-utilities" (OuterVolumeSpecName: "utilities") pod "79bc8e8e-4c59-4615-a582-cd20864f2204" (UID: "79bc8e8e-4c59-4615-a582-cd20864f2204"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.651649 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bc8e8e-4c59-4615-a582-cd20864f2204-kube-api-access-qgs7k" (OuterVolumeSpecName: "kube-api-access-qgs7k") pod "79bc8e8e-4c59-4615-a582-cd20864f2204" (UID: "79bc8e8e-4c59-4615-a582-cd20864f2204"). InnerVolumeSpecName "kube-api-access-qgs7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.653444 4672 scope.go:117] "RemoveContainer" containerID="bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad" Dec 06 09:55:05 crc kubenswrapper[4672]: E1206 09:55:05.656998 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad\": container with ID starting with bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad not found: ID does not exist" containerID="bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.657040 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad"} err="failed to get container status \"bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad\": rpc error: code = NotFound desc = could not find container \"bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad\": container with ID starting with bc16de101afec8fd50b9c0b9c6a1aeee9d625ed56bfd82715378c3aeccd8d0ad not found: ID does not exist" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.657063 4672 scope.go:117] "RemoveContainer" containerID="36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0" Dec 06 09:55:05 crc kubenswrapper[4672]: E1206 09:55:05.660918 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0\": container with ID starting with 36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0 not found: ID does not exist" containerID="36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.660947 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0"} err="failed to get container status \"36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0\": rpc error: code = NotFound desc = could not find container \"36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0\": container with ID starting with 36c7ea6295d72278af57d5a5decf54780531c16de50744f48a23f5cd8ff8edb0 not found: ID does not exist" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.660969 4672 scope.go:117] "RemoveContainer" containerID="9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2" Dec 06 09:55:05 crc kubenswrapper[4672]: E1206 09:55:05.661319 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2\": container with ID starting with 9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2 not found: ID does not 
exist" containerID="9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.661350 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2"} err="failed to get container status \"9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2\": rpc error: code = NotFound desc = could not find container \"9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2\": container with ID starting with 9bdd24c79516200dae9829c6b97f8eacd9c447eec1d990947a61868f85af0ae2 not found: ID does not exist" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.694515 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79bc8e8e-4c59-4615-a582-cd20864f2204" (UID: "79bc8e8e-4c59-4615-a582-cd20864f2204"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.747471 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgs7k\" (UniqueName: \"kubernetes.io/projected/79bc8e8e-4c59-4615-a582-cd20864f2204-kube-api-access-qgs7k\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.747773 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.747782 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bc8e8e-4c59-4615-a582-cd20864f2204-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.881630 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hb9f"] Dec 06 09:55:05 crc kubenswrapper[4672]: I1206 09:55:05.895951 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7hb9f"] Dec 06 09:55:06 crc kubenswrapper[4672]: I1206 09:55:06.573678 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" path="/var/lib/kubelet/pods/79bc8e8e-4c59-4615-a582-cd20864f2204/volumes" Dec 06 09:55:12 crc kubenswrapper[4672]: I1206 09:55:12.320113 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:55:12 crc kubenswrapper[4672]: I1206 09:55:12.320767 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:55:42 crc kubenswrapper[4672]: I1206 09:55:42.319781 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:55:42 crc kubenswrapper[4672]: I1206 09:55:42.320474 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:56:12 crc kubenswrapper[4672]: I1206 09:56:12.319488 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:56:12 crc kubenswrapper[4672]: I1206 09:56:12.320079 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:56:12 crc kubenswrapper[4672]: I1206 09:56:12.320128 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 09:56:12 crc kubenswrapper[4672]: I1206 09:56:12.320869 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:56:12 crc kubenswrapper[4672]: I1206 09:56:12.320926 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" gracePeriod=600 Dec 06 09:56:12 crc kubenswrapper[4672]: E1206 09:56:12.450964 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:56:13 crc kubenswrapper[4672]: I1206 09:56:13.234378 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" exitCode=0 Dec 06 09:56:13 crc kubenswrapper[4672]: I1206 09:56:13.234438 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55"} Dec 06 09:56:13 crc kubenswrapper[4672]: I1206 09:56:13.234640 4672 scope.go:117] "RemoveContainer" containerID="7cd1be7755bf34af00f4c17fd0804f052c17aac5446317d3315921d4c9466ed8" Dec 06 
09:56:13 crc kubenswrapper[4672]: I1206 09:56:13.236281 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:56:13 crc kubenswrapper[4672]: E1206 09:56:13.237253 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:56:26 crc kubenswrapper[4672]: I1206 09:56:26.562494 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:56:26 crc kubenswrapper[4672]: E1206 09:56:26.563249 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:56:39 crc kubenswrapper[4672]: I1206 09:56:39.557793 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:56:39 crc kubenswrapper[4672]: E1206 09:56:39.558828 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:56:54 crc kubenswrapper[4672]: I1206 09:56:54.556878 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:56:54 crc kubenswrapper[4672]: E1206 09:56:54.557584 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:57:06 crc kubenswrapper[4672]: I1206 09:57:06.723316 4672 generic.go:334] "Generic (PLEG): container finished" podID="cc85e883-c516-489f-b15d-6e57e4236b75" containerID="40978211a115bc9a5378ec7e30af1e055397c6e62a1fef0d627bdb480eae7553" exitCode=0 Dec 06 09:57:06 crc kubenswrapper[4672]: I1206 09:57:06.723379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" event={"ID":"cc85e883-c516-489f-b15d-6e57e4236b75","Type":"ContainerDied","Data":"40978211a115bc9a5378ec7e30af1e055397c6e62a1fef0d627bdb480eae7553"} Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.124659 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.251094 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxwt\" (UniqueName: \"kubernetes.io/projected/cc85e883-c516-489f-b15d-6e57e4236b75-kube-api-access-vpxwt\") pod \"cc85e883-c516-489f-b15d-6e57e4236b75\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.251548 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-secret-0\") pod \"cc85e883-c516-489f-b15d-6e57e4236b75\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.251574 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-combined-ca-bundle\") pod \"cc85e883-c516-489f-b15d-6e57e4236b75\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.251629 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ssh-key\") pod \"cc85e883-c516-489f-b15d-6e57e4236b75\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.251703 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ceph\") pod \"cc85e883-c516-489f-b15d-6e57e4236b75\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.251794 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-inventory\") pod \"cc85e883-c516-489f-b15d-6e57e4236b75\" (UID: \"cc85e883-c516-489f-b15d-6e57e4236b75\") " Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.256895 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cc85e883-c516-489f-b15d-6e57e4236b75" (UID: "cc85e883-c516-489f-b15d-6e57e4236b75"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.264388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ceph" (OuterVolumeSpecName: "ceph") pod "cc85e883-c516-489f-b15d-6e57e4236b75" (UID: "cc85e883-c516-489f-b15d-6e57e4236b75"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.271045 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc85e883-c516-489f-b15d-6e57e4236b75-kube-api-access-vpxwt" (OuterVolumeSpecName: "kube-api-access-vpxwt") pod "cc85e883-c516-489f-b15d-6e57e4236b75" (UID: "cc85e883-c516-489f-b15d-6e57e4236b75"). InnerVolumeSpecName "kube-api-access-vpxwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.278495 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-inventory" (OuterVolumeSpecName: "inventory") pod "cc85e883-c516-489f-b15d-6e57e4236b75" (UID: "cc85e883-c516-489f-b15d-6e57e4236b75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.279550 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc85e883-c516-489f-b15d-6e57e4236b75" (UID: "cc85e883-c516-489f-b15d-6e57e4236b75"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.281774 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "cc85e883-c516-489f-b15d-6e57e4236b75" (UID: "cc85e883-c516-489f-b15d-6e57e4236b75"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.353400 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.353652 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.353729 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxwt\" (UniqueName: \"kubernetes.io/projected/cc85e883-c516-489f-b15d-6e57e4236b75-kube-api-access-vpxwt\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.353794 4672 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.353855 4672 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.353910 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc85e883-c516-489f-b15d-6e57e4236b75-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.557777 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.558212 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.741245 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" event={"ID":"cc85e883-c516-489f-b15d-6e57e4236b75","Type":"ContainerDied","Data":"e059a91d0593c64504621f0954cf0d09c1ab22420e243dea70153765ca5fe419"} Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.741279 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e059a91d0593c64504621f0954cf0d09c1ab22420e243dea70153765ca5fe419" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.741330 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.925824 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb"] Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926256 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="extract-utilities" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926280 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="extract-utilities" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926303 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="registry-server" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926310 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="registry-server" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926322 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="extract-content" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926330 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="extract-content" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926347 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="extract-utilities" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926355 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="extract-utilities" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926364 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="extract-content" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926371 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="extract-content" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926383 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="registry-server" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926391 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="registry-server" Dec 06 09:57:08 crc kubenswrapper[4672]: E1206 09:57:08.926404 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc85e883-c516-489f-b15d-6e57e4236b75" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926412 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc85e883-c516-489f-b15d-6e57e4236b75" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926633 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc85e883-c516-489f-b15d-6e57e4236b75" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926664 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bc8e8e-4c59-4615-a582-cd20864f2204" containerName="registry-server" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.926681 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68a1365-5a93-4505-99ae-e63a2c422aac" containerName="registry-server" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.927386 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.929438 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.929507 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.929830 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.930777 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.930981 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.931747 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.931769 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.933221 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p6qrb" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.933874 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:57:08 crc kubenswrapper[4672]: I1206 09:57:08.935615 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb"] Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006067 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006275 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006396 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006439 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006642 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006712 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006814 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh22b\" (UniqueName: \"kubernetes.io/projected/b27237d2-1240-4f55-a12b-9248c3a899e4-kube-api-access-nh22b\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.006893 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.007046 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.007091 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109647 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109698 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109720 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109812 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109849 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109871 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109889 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh22b\" (UniqueName: \"kubernetes.io/projected/b27237d2-1240-4f55-a12b-9248c3a899e4-kube-api-access-nh22b\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109913 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109955 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.109982 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.111216 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.111270 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.114850 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.115226 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.115372 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.116227 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.116456 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.117454 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.117577 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.117766 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 
09:57:09.130350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh22b\" (UniqueName: \"kubernetes.io/projected/b27237d2-1240-4f55-a12b-9248c3a899e4-kube-api-access-nh22b\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.254729 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.787494 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb"] Dec 06 09:57:09 crc kubenswrapper[4672]: I1206 09:57:09.803108 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:57:10 crc kubenswrapper[4672]: I1206 09:57:10.761220 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" event={"ID":"b27237d2-1240-4f55-a12b-9248c3a899e4","Type":"ContainerStarted","Data":"5d3dd27f20fd0e7b4c4549e612b94a91f303d3278a154bee80088c44113e2be0"} Dec 06 09:57:11 crc kubenswrapper[4672]: I1206 09:57:11.772448 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" event={"ID":"b27237d2-1240-4f55-a12b-9248c3a899e4","Type":"ContainerStarted","Data":"3f683b8077e4359ad29917fdba617b0f7036171ec50552be51c7ba12361afeab"} Dec 06 09:57:11 crc kubenswrapper[4672]: I1206 09:57:11.802205 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" podStartSLOduration=3.106891639 podStartE2EDuration="3.802183575s" podCreationTimestamp="2025-12-06 09:57:08 +0000 UTC" firstStartedPulling="2025-12-06 09:57:09.802538002 +0000 UTC m=+3047.546798299" lastFinishedPulling="2025-12-06 09:57:10.497829898 +0000 UTC m=+3048.242090235" observedRunningTime="2025-12-06 09:57:11.796173463 +0000 UTC m=+3049.540433790" watchObservedRunningTime="2025-12-06 09:57:11.802183575 +0000 UTC m=+3049.546443862" Dec 06 09:57:22 crc kubenswrapper[4672]: I1206 09:57:22.561895 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:57:22 crc kubenswrapper[4672]: E1206 09:57:22.562730 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:57:33 crc kubenswrapper[4672]: I1206 09:57:33.556988 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:57:33 crc kubenswrapper[4672]: E1206 09:57:33.558130 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
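
The pod_startup_latency_tracker line above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (3.802183575s), and podStartSLOduration excludes image-pull time measured on the monotonic clock, lastFinishedPulling minus firstStartedPulling (m=+3048.242090235 − m=+3047.546798299 = 0.695291936s), giving 3.802183575 − 0.695291936 = 3.106891639s, exactly the value logged. A quick check of the arithmetic:

    package main

    import "fmt"

    func main() {
        // Monotonic-clock readings (m=+...) and E2E duration from the log line.
        firstStartedPulling := 3047.546798299
        lastFinishedPulling := 3048.242090235
        e2e := 3.802183575 // podStartE2EDuration in seconds

        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image pull:          %.9fs\n", pull)     // 0.695291936s
        fmt.Printf("podStartSLOduration: %.9fs\n", e2e-pull) // 3.106891639s
    }
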
pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:57:45 crc kubenswrapper[4672]: I1206 09:57:45.557128 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:57:45 crc kubenswrapper[4672]: E1206 09:57:45.557951 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:57:57 crc kubenswrapper[4672]: I1206 09:57:57.558089 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:57:57 crc kubenswrapper[4672]: E1206 09:57:57.558933 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:58:11 crc kubenswrapper[4672]: I1206 09:58:11.557467 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:58:11 crc kubenswrapper[4672]: E1206 09:58:11.558446 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:58:25 crc kubenswrapper[4672]: I1206 09:58:25.557169 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:58:25 crc kubenswrapper[4672]: E1206 09:58:25.558431 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:58:37 crc kubenswrapper[4672]: I1206 09:58:37.556953 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:58:37 crc kubenswrapper[4672]: E1206 09:58:37.557678 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:58:51 crc kubenswrapper[4672]: I1206 09:58:51.558523 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:58:51 crc kubenswrapper[4672]: E1206 09:58:51.559316 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:59:03 crc kubenswrapper[4672]: I1206 09:59:03.557486 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:59:03 crc kubenswrapper[4672]: E1206 09:59:03.558266 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:59:18 crc kubenswrapper[4672]: I1206 09:59:18.558110 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:59:18 crc kubenswrapper[4672]: E1206 09:59:18.559063 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:59:32 crc kubenswrapper[4672]: I1206 09:59:32.562438 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:59:32 crc kubenswrapper[4672]: E1206 09:59:32.563297 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:59:43 crc kubenswrapper[4672]: I1206 09:59:43.557412 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:59:43 crc kubenswrapper[4672]: E1206 09:59:43.558163 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 09:59:56 crc kubenswrapper[4672]: I1206 09:59:56.556469 4672 scope.go:117] "RemoveContainer" 
containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 09:59:56 crc kubenswrapper[4672]: E1206 09:59:56.557240 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.154704 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45"] Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.156101 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.158151 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.158492 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.171148 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45"] Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.186991 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24cc0cd8-74cb-448b-bff2-3addfac76ca8-secret-volume\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.187044 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhwvp\" (UniqueName: \"kubernetes.io/projected/24cc0cd8-74cb-448b-bff2-3addfac76ca8-kube-api-access-qhwvp\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.187116 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24cc0cd8-74cb-448b-bff2-3addfac76ca8-config-volume\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.288897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24cc0cd8-74cb-448b-bff2-3addfac76ca8-config-volume\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.289044 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/24cc0cd8-74cb-448b-bff2-3addfac76ca8-secret-volume\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.289065 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhwvp\" (UniqueName: \"kubernetes.io/projected/24cc0cd8-74cb-448b-bff2-3addfac76ca8-kube-api-access-qhwvp\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.290122 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24cc0cd8-74cb-448b-bff2-3addfac76ca8-config-volume\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.298733 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24cc0cd8-74cb-448b-bff2-3addfac76ca8-secret-volume\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.316736 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwvp\" (UniqueName: \"kubernetes.io/projected/24cc0cd8-74cb-448b-bff2-3addfac76ca8-kube-api-access-qhwvp\") pod \"collect-profiles-29416920-lwr45\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.478810 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:00 crc kubenswrapper[4672]: I1206 10:00:00.956810 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45"] Dec 06 10:00:01 crc kubenswrapper[4672]: I1206 10:00:01.348991 4672 generic.go:334] "Generic (PLEG): container finished" podID="24cc0cd8-74cb-448b-bff2-3addfac76ca8" containerID="88fe823d6f30fbcf79dedb9399a0798d6907e74031d6ca36222fe2f866ebb201" exitCode=0 Dec 06 10:00:01 crc kubenswrapper[4672]: I1206 10:00:01.349373 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" event={"ID":"24cc0cd8-74cb-448b-bff2-3addfac76ca8","Type":"ContainerDied","Data":"88fe823d6f30fbcf79dedb9399a0798d6907e74031d6ca36222fe2f866ebb201"} Dec 06 10:00:01 crc kubenswrapper[4672]: I1206 10:00:01.349406 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" event={"ID":"24cc0cd8-74cb-448b-bff2-3addfac76ca8","Type":"ContainerStarted","Data":"1db20f89cf44051eadb5ad315b611b92f103b72c8f2b5b6105d7340a6e93c4d7"} Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.658679 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.767816 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhwvp\" (UniqueName: \"kubernetes.io/projected/24cc0cd8-74cb-448b-bff2-3addfac76ca8-kube-api-access-qhwvp\") pod \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.767865 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24cc0cd8-74cb-448b-bff2-3addfac76ca8-config-volume\") pod \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.767937 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24cc0cd8-74cb-448b-bff2-3addfac76ca8-secret-volume\") pod \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\" (UID: \"24cc0cd8-74cb-448b-bff2-3addfac76ca8\") " Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.768791 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24cc0cd8-74cb-448b-bff2-3addfac76ca8-config-volume" (OuterVolumeSpecName: "config-volume") pod "24cc0cd8-74cb-448b-bff2-3addfac76ca8" (UID: "24cc0cd8-74cb-448b-bff2-3addfac76ca8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.777851 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24cc0cd8-74cb-448b-bff2-3addfac76ca8-kube-api-access-qhwvp" (OuterVolumeSpecName: "kube-api-access-qhwvp") pod "24cc0cd8-74cb-448b-bff2-3addfac76ca8" (UID: "24cc0cd8-74cb-448b-bff2-3addfac76ca8"). InnerVolumeSpecName "kube-api-access-qhwvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.777876 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24cc0cd8-74cb-448b-bff2-3addfac76ca8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "24cc0cd8-74cb-448b-bff2-3addfac76ca8" (UID: "24cc0cd8-74cb-448b-bff2-3addfac76ca8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.869659 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhwvp\" (UniqueName: \"kubernetes.io/projected/24cc0cd8-74cb-448b-bff2-3addfac76ca8-kube-api-access-qhwvp\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.869690 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24cc0cd8-74cb-448b-bff2-3addfac76ca8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:02 crc kubenswrapper[4672]: I1206 10:00:02.869701 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24cc0cd8-74cb-448b-bff2-3addfac76ca8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:03 crc kubenswrapper[4672]: I1206 10:00:03.374919 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" event={"ID":"24cc0cd8-74cb-448b-bff2-3addfac76ca8","Type":"ContainerDied","Data":"1db20f89cf44051eadb5ad315b611b92f103b72c8f2b5b6105d7340a6e93c4d7"} Dec 06 10:00:03 crc kubenswrapper[4672]: I1206 10:00:03.374967 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-lwr45" Dec 06 10:00:03 crc kubenswrapper[4672]: I1206 10:00:03.374974 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1db20f89cf44051eadb5ad315b611b92f103b72c8f2b5b6105d7340a6e93c4d7" Dec 06 10:00:03 crc kubenswrapper[4672]: I1206 10:00:03.767178 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl"] Dec 06 10:00:03 crc kubenswrapper[4672]: I1206 10:00:03.775894 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-rxxdl"] Dec 06 10:00:04 crc kubenswrapper[4672]: I1206 10:00:04.570722 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5571247b-599d-4a18-b507-f9a153ded2ec" path="/var/lib/kubelet/pods/5571247b-599d-4a18-b507-f9a153ded2ec/volumes" Dec 06 10:00:07 crc kubenswrapper[4672]: I1206 10:00:07.557018 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:00:07 crc kubenswrapper[4672]: E1206 10:00:07.557571 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:00:22 crc kubenswrapper[4672]: I1206 10:00:22.566347 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:00:22 crc kubenswrapper[4672]: E1206 10:00:22.568242 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:00:28 crc kubenswrapper[4672]: I1206 10:00:28.131902 4672 scope.go:117] "RemoveContainer" containerID="6e94145b9634ce6896a4ba9e85586c7ab48924ba369b1a34839cb38fb75a2012" Dec 06 10:00:35 crc kubenswrapper[4672]: I1206 10:00:35.556589 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:00:35 crc kubenswrapper[4672]: E1206 10:00:35.557303 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:00:36 crc kubenswrapper[4672]: I1206 10:00:36.665646 4672 generic.go:334] "Generic (PLEG): container finished" podID="b27237d2-1240-4f55-a12b-9248c3a899e4" containerID="3f683b8077e4359ad29917fdba617b0f7036171ec50552be51c7ba12361afeab" exitCode=0 Dec 06 10:00:36 crc kubenswrapper[4672]: I1206 10:00:36.665798 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" event={"ID":"b27237d2-1240-4f55-a12b-9248c3a899e4","Type":"ContainerDied","Data":"3f683b8077e4359ad29917fdba617b0f7036171ec50552be51c7ba12361afeab"} Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.072219 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245621 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245667 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-1\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245751 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-0\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245805 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-0\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245827 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ssh-key\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: 
\"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245861 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-inventory\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245889 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph-nova-0\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245941 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-extra-config-0\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245977 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh22b\" (UniqueName: \"kubernetes.io/projected/b27237d2-1240-4f55-a12b-9248c3a899e4-kube-api-access-nh22b\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.245997 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-custom-ceph-combined-ca-bundle\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.246028 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-1\") pod \"b27237d2-1240-4f55-a12b-9248c3a899e4\" (UID: \"b27237d2-1240-4f55-a12b-9248c3a899e4\") " Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.254853 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27237d2-1240-4f55-a12b-9248c3a899e4-kube-api-access-nh22b" (OuterVolumeSpecName: "kube-api-access-nh22b") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "kube-api-access-nh22b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.267943 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.270255 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph" (OuterVolumeSpecName: "ceph") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.273854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.283994 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.294161 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.295260 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-inventory" (OuterVolumeSpecName: "inventory") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.295876 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.299579 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.301652 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.303770 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b27237d2-1240-4f55-a12b-9248c3a899e4" (UID: "b27237d2-1240-4f55-a12b-9248c3a899e4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348127 4672 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348354 4672 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348469 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348551 4672 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348771 4672 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348861 4672 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.348941 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.349011 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27237d2-1240-4f55-a12b-9248c3a899e4-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.349093 4672 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.349172 4672 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b27237d2-1240-4f55-a12b-9248c3a899e4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.349241 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh22b\" (UniqueName: \"kubernetes.io/projected/b27237d2-1240-4f55-a12b-9248c3a899e4-kube-api-access-nh22b\") on node \"crc\" 
DevicePath \"\"" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.682477 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" event={"ID":"b27237d2-1240-4f55-a12b-9248c3a899e4","Type":"ContainerDied","Data":"5d3dd27f20fd0e7b4c4549e612b94a91f303d3278a154bee80088c44113e2be0"} Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.682513 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3dd27f20fd0e7b4c4549e612b94a91f303d3278a154bee80088c44113e2be0" Dec 06 10:00:38 crc kubenswrapper[4672]: I1206 10:00:38.682532 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb" Dec 06 10:00:47 crc kubenswrapper[4672]: I1206 10:00:47.556814 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:00:47 crc kubenswrapper[4672]: E1206 10:00:47.557560 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.965135 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 06 10:00:53 crc kubenswrapper[4672]: E1206 10:00:53.965969 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24cc0cd8-74cb-448b-bff2-3addfac76ca8" containerName="collect-profiles" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.965981 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="24cc0cd8-74cb-448b-bff2-3addfac76ca8" containerName="collect-profiles" Dec 06 10:00:53 crc kubenswrapper[4672]: E1206 10:00:53.965998 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27237d2-1240-4f55-a12b-9248c3a899e4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.966008 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27237d2-1240-4f55-a12b-9248c3a899e4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.966173 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="24cc0cd8-74cb-448b-bff2-3addfac76ca8" containerName="collect-profiles" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.966189 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27237d2-1240-4f55-a12b-9248c3a899e4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.967141 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.969162 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.974836 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.975820 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.977232 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.981220 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.988718 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 06 10:00:53 crc kubenswrapper[4672]: I1206 10:00:53.996315 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028277 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028324 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv76z\" (UniqueName: \"kubernetes.io/projected/db88dbb4-2112-4bec-a4e4-f0bf562bb173-kube-api-access-hv76z\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028345 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028368 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028399 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/db88dbb4-2112-4bec-a4e4-f0bf562bb173-ceph\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-dev\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028433 4672 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-config-data-custom\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028449 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028466 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-sys\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028480 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028497 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028519 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028532 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-lib-modules\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028550 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028569 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028591 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028625 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028639 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a35af03e-7b48-40ee-a857-20824a664f4e-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028670 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028683 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-run\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028704 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-scripts\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028722 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9jn\" (UniqueName: \"kubernetes.io/projected/a35af03e-7b48-40ee-a857-20824a664f4e-kube-api-access-gn9jn\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028738 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-run\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028753 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-dev\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028773 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028791 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028812 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028826 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-config-data\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028843 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028864 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028883 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-sys\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.028912 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-nvme\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.130851 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/db88dbb4-2112-4bec-a4e4-f0bf562bb173-ceph\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131114 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-dev\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" 
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131135 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-config-data-custom\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131153 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131171 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-sys\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131204 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131246 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-lib-modules\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131260 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131281 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131304 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131324 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a35af03e-7b48-40ee-a857-20824a664f4e-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131338 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131372 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131388 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-run\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131413 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-scripts\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131437 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9jn\" (UniqueName: \"kubernetes.io/projected/a35af03e-7b48-40ee-a857-20824a664f4e-kube-api-access-gn9jn\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131455 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-run\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131471 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-dev\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131488 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131523 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131553 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131568 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-config-data\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131587 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131627 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131665 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-sys\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131703 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-nvme\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131733 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131750 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131765 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv76z\" (UniqueName: \"kubernetes.io/projected/db88dbb4-2112-4bec-a4e4-f0bf562bb173-kube-api-access-hv76z\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.131790 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.133811 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-run\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.139643 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.139741 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.139805 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-dev\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.139857 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.139889 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.139911 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-sys\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140003 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-nvme\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0"
Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140087 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0"
\"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140117 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140333 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140360 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140335 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-dev\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140447 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140483 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-lib-modules\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140512 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140625 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-run\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140692 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.140829 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-var-lib-cinder\") pod 
\"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.141121 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db88dbb4-2112-4bec-a4e4-f0bf562bb173-sys\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.144100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.144183 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a35af03e-7b48-40ee-a857-20824a664f4e-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.145515 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-config-data\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.147793 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.152198 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a35af03e-7b48-40ee-a857-20824a664f4e-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.152851 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-config-data-custom\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.160229 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/db88dbb4-2112-4bec-a4e4-f0bf562bb173-ceph\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.160856 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-scripts\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.163156 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db88dbb4-2112-4bec-a4e4-f0bf562bb173-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.163656 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35af03e-7b48-40ee-a857-20824a664f4e-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.175547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv76z\" (UniqueName: \"kubernetes.io/projected/db88dbb4-2112-4bec-a4e4-f0bf562bb173-kube-api-access-hv76z\") pod \"cinder-backup-0\" (UID: \"db88dbb4-2112-4bec-a4e4-f0bf562bb173\") " pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.185848 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9jn\" (UniqueName: \"kubernetes.io/projected/a35af03e-7b48-40ee-a857-20824a664f4e-kube-api-access-gn9jn\") pod \"cinder-volume-volume1-0\" (UID: \"a35af03e-7b48-40ee-a857-20824a664f4e\") " pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.285958 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.315438 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.819639 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.823053 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.838286 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.838512 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.838665 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zbcbp" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.838817 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.875795 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgm7n\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-kube-api-access-vgm7n\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.891467 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-config-data\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.891500 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.891523 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.891570 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-logs\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.891655 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-scripts\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.891773 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " 
pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.885046 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.892101 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-ceph\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.892121 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.893423 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.907609 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.908151 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.914668 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.945361 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.981492 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-fvtdd"] Dec 06 10:00:54 crc kubenswrapper[4672]: I1206 10:00:54.982691 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995247 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgm7n\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-kube-api-access-vgm7n\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995287 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-config-data\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995304 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995320 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995342 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-logs\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995370 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-scripts\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995416 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995454 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-ceph\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:54.995469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 
10:00:54.997243 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.002379 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.014292 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-scripts\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.015758 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.016195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-logs\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.017132 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-config-data\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.069776 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.070617 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.075491 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-ceph\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.075704 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgm7n\" (UniqueName: 
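glance-default-external-api-0's storage volume goes through an extra stage that the secrets and host paths above do not need: operation_generator.go:580 (MountVolume.MountDevice) establishes the volume's node-level mount, resolving local-storage08-crc to /mnt/openstack/pv08, and only then does operation_generator.go:637 (SetUp) expose it under the pod's own volumes directory, which for a local volume amounts to a bind mount. A sketch of that two-stage shape using identifiers from the log; the path helpers are illustrative, not kubelet code:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// Identifiers taken from the records above.
const (
	podUID     = "c68f70da-7f06-4f89-8b77-cb1fa481bc29"
	volume     = "local-storage08-crc"
	devicePath = "/mnt/openstack/pv08" // the logged "device mount path"
)

// deviceMount is stage one: one node-wide mount per volume, shared by
// every pod that uses it. For a pre-provisioned local PV it is simply
// the backing directory.
func deviceMount() string { return devicePath }

// podMount is stage two (SetUp): the per-pod view, conceptually a bind
// mount of the device path into the pod's volumes directory (assumed
// layout; compare the /var/lib/kubelet/pods/<uid>/volumes paths earlier
// in this journal).
func podMount() string {
	return filepath.Join("/var/lib/kubelet/pods", podUID,
		"volumes", "kubernetes.io~local-volume", volume)
}

func main() {
	fmt.Println("MountDevice ->", deviceMount())
	fmt.Println("SetUp binds it at ->", podMount())
}
```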
\"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-kube-api-access-vgm7n\") pod \"glance-default-external-api-0\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.090733 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fvtdd"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118586 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118642 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118679 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118701 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118758 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6qm\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-kube-api-access-fl6qm\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118840 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118859 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " 
pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.118893 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.152253 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9br\" (UniqueName: \"kubernetes.io/projected/54fd106c-75b6-4c7a-810c-dd13d4655cba-kube-api-access-vn9br\") pod \"manila-db-create-fvtdd\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220572 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6qm\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-kube-api-access-fl6qm\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220668 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220690 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220725 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54fd106c-75b6-4c7a-810c-dd13d4655cba-operator-scripts\") pod \"manila-db-create-fvtdd\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220745 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220780 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220803 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220837 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.220856 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.222367 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.233405 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.233773 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.235177 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.237418 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.238103 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-3062-account-create-update-7k7nc"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.238450 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.239189 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.244067 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.253226 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.253504 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.265077 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.324061 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9br\" (UniqueName: \"kubernetes.io/projected/54fd106c-75b6-4c7a-810c-dd13d4655cba-kube-api-access-vn9br\") pod \"manila-db-create-fvtdd\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.324150 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129c23da-4548-4207-808e-296c6e1f6396-operator-scripts\") pod \"manila-3062-account-create-update-7k7nc\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.324216 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/54fd106c-75b6-4c7a-810c-dd13d4655cba-operator-scripts\") pod \"manila-db-create-fvtdd\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.324234 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j2qj\" (UniqueName: \"kubernetes.io/projected/129c23da-4548-4207-808e-296c6e1f6396-kube-api-access-2j2qj\") pod \"manila-3062-account-create-update-7k7nc\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.325156 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54fd106c-75b6-4c7a-810c-dd13d4655cba-operator-scripts\") pod \"manila-db-create-fvtdd\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.327537 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6qm\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-kube-api-access-fl6qm\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.365012 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3062-account-create-update-7k7nc"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.407142 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.417130 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9br\" (UniqueName: \"kubernetes.io/projected/54fd106c-75b6-4c7a-810c-dd13d4655cba-kube-api-access-vn9br\") pod \"manila-db-create-fvtdd\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.426812 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129c23da-4548-4207-808e-296c6e1f6396-operator-scripts\") pod \"manila-3062-account-create-update-7k7nc\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.426897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qj\" (UniqueName: \"kubernetes.io/projected/129c23da-4548-4207-808e-296c6e1f6396-kube-api-access-2j2qj\") pod \"manila-3062-account-create-update-7k7nc\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.428153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129c23da-4548-4207-808e-296c6e1f6396-operator-scripts\") pod \"manila-3062-account-create-update-7k7nc\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " 
pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.484362 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c65c599b9-mz2lg"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.485843 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.495332 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2qj\" (UniqueName: \"kubernetes.io/projected/129c23da-4548-4207-808e-296c6e1f6396-kube-api-access-2j2qj\") pod \"manila-3062-account-create-update-7k7nc\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.509220 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.509445 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.509555 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.509739 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rv44v" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.523218 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.528549 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c65c599b9-mz2lg"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.529706 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-scripts\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.529760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d26c88b2-357e-4c41-8f93-1f6422329000-horizon-secret-key\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.529830 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrckh\" (UniqueName: \"kubernetes.io/projected/d26c88b2-357e-4c41-8f93-1f6422329000-kube-api-access-vrckh\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.529849 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-config-data\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.529868 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26c88b2-357e-4c41-8f93-1f6422329000-logs\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.563827 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.570316 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.634068 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrckh\" (UniqueName: \"kubernetes.io/projected/d26c88b2-357e-4c41-8f93-1f6422329000-kube-api-access-vrckh\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.634130 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-config-data\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.634172 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26c88b2-357e-4c41-8f93-1f6422329000-logs\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.634299 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-scripts\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.634341 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d26c88b2-357e-4c41-8f93-1f6422329000-horizon-secret-key\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.639119 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d26c88b2-357e-4c41-8f93-1f6422329000-horizon-secret-key\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.639668 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-scripts\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.641209 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-config-data\") pod \"horizon-7c65c599b9-mz2lg\" (UID: 
\"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.646490 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26c88b2-357e-4c41-8f93-1f6422329000-logs\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.665026 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fvtdd" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.677272 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrckh\" (UniqueName: \"kubernetes.io/projected/d26c88b2-357e-4c41-8f93-1f6422329000-kube-api-access-vrckh\") pod \"horizon-7c65c599b9-mz2lg\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") " pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.754737 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.828465 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76f96bf4d5-7wt8s"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.830065 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.873360 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.879473 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"db88dbb4-2112-4bec-a4e4-f0bf562bb173","Type":"ContainerStarted","Data":"5af1f262062467cf637b3caacfa3d8e3fd5a896aa60440af5ee8b02523a2d64e"} Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.887372 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76f96bf4d5-7wt8s"] Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.941628 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b071d77-1fdb-4938-95ee-10e91492c545-logs\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.941695 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9b071d77-1fdb-4938-95ee-10e91492c545-horizon-secret-key\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.941732 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-scripts\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.941842 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzvq\" (UniqueName: 
\"kubernetes.io/projected/9b071d77-1fdb-4938-95ee-10e91492c545-kube-api-access-lrzvq\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.941888 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-config-data\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:55 crc kubenswrapper[4672]: I1206 10:00:55.948277 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.049723 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-scripts\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.049919 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzvq\" (UniqueName: \"kubernetes.io/projected/9b071d77-1fdb-4938-95ee-10e91492c545-kube-api-access-lrzvq\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.049987 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-config-data\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.050031 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b071d77-1fdb-4938-95ee-10e91492c545-logs\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.050104 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9b071d77-1fdb-4938-95ee-10e91492c545-horizon-secret-key\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.053551 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b071d77-1fdb-4938-95ee-10e91492c545-logs\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.056424 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-config-data\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.064840 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-scripts\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.087135 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9b071d77-1fdb-4938-95ee-10e91492c545-horizon-secret-key\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.120205 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzvq\" (UniqueName: \"kubernetes.io/projected/9b071d77-1fdb-4938-95ee-10e91492c545-kube-api-access-lrzvq\") pod \"horizon-76f96bf4d5-7wt8s\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") " pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.243351 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.704760 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:00:56 crc kubenswrapper[4672]: W1206 10:00:56.725379 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc68f70da_7f06_4f89_8b77_cb1fa481bc29.slice/crio-e44116545b3f0e3561d0ca59248552fa6d9569e890e6176a6fa4d99b0f16de52 WatchSource:0}: Error finding container e44116545b3f0e3561d0ca59248552fa6d9569e890e6176a6fa4d99b0f16de52: Status 404 returned error can't find the container with id e44116545b3f0e3561d0ca59248552fa6d9569e890e6176a6fa4d99b0f16de52 Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.765765 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fvtdd"] Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.927540 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"db88dbb4-2112-4bec-a4e4-f0bf562bb173","Type":"ContainerStarted","Data":"d8f547862807336ef31c6e32875c0d4929e5a0ded24f9ca3b9dfd099883b00f7"} Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.949573 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a35af03e-7b48-40ee-a857-20824a664f4e","Type":"ContainerStarted","Data":"c3114bb09ba83bcbeb2c3b222219ae60f0964c22e8001bb6db7cd4995b72f076"} Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.954841 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fvtdd" event={"ID":"54fd106c-75b6-4c7a-810c-dd13d4655cba","Type":"ContainerStarted","Data":"a971c0ef0efe3b2e052f9c6c580ebeab8544b6eaf87d9445610f5c8f5eaa8148"} Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.972692 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3062-account-create-update-7k7nc"] Dec 06 10:00:56 crc kubenswrapper[4672]: I1206 10:00:56.986881 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c68f70da-7f06-4f89-8b77-cb1fa481bc29","Type":"ContainerStarted","Data":"e44116545b3f0e3561d0ca59248552fa6d9569e890e6176a6fa4d99b0f16de52"} Dec 06 10:00:57 crc kubenswrapper[4672]: I1206 10:00:57.017584 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-7c65c599b9-mz2lg"] Dec 06 10:00:57 crc kubenswrapper[4672]: I1206 10:00:57.048922 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76f96bf4d5-7wt8s"] Dec 06 10:00:57 crc kubenswrapper[4672]: I1206 10:00:57.057794 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:00:57 crc kubenswrapper[4672]: I1206 10:00:57.999404 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c65c599b9-mz2lg" event={"ID":"d26c88b2-357e-4c41-8f93-1f6422329000","Type":"ContainerStarted","Data":"3337070ccd27e6a4582d339749efb97cb0bccc215ff1d359f46e806dfee91191"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.002327 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"db88dbb4-2112-4bec-a4e4-f0bf562bb173","Type":"ContainerStarted","Data":"e9192c6bbee92aeaa527dfb614d91bab18d9ea0e4c9a5a872a373ec91991063b"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.009391 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0","Type":"ContainerStarted","Data":"f01e7a165873833ed18debf401b773560d06375996329cbd6743d3e8f2876426"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.011199 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3062-account-create-update-7k7nc" event={"ID":"129c23da-4548-4207-808e-296c6e1f6396","Type":"ContainerStarted","Data":"3090f0031848a6bf3c8a021d9097785a1f3dbe9b99e3f54cb478402d8e24aa94"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.013061 4672 generic.go:334] "Generic (PLEG): container finished" podID="54fd106c-75b6-4c7a-810c-dd13d4655cba" containerID="8c1542d192562b39daee0b6830d11cc3c88e0389589e3bab2d4a2c69002de8ee" exitCode=0 Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.013127 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fvtdd" event={"ID":"54fd106c-75b6-4c7a-810c-dd13d4655cba","Type":"ContainerDied","Data":"8c1542d192562b39daee0b6830d11cc3c88e0389589e3bab2d4a2c69002de8ee"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.015091 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c68f70da-7f06-4f89-8b77-cb1fa481bc29","Type":"ContainerStarted","Data":"7c77bc375801e61fc4836f40c0765bd9b6fddae90557bd94a1cc73b4901ab7c3"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.016460 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76f96bf4d5-7wt8s" event={"ID":"9b071d77-1fdb-4938-95ee-10e91492c545","Type":"ContainerStarted","Data":"d8384d411952f3eb9bb4d82aded129e5eab4d0e33ccc750bb50ebed0bda1bd96"} Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.035376 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.712350286 podStartE2EDuration="5.035358917s" podCreationTimestamp="2025-12-06 10:00:53 +0000 UTC" firstStartedPulling="2025-12-06 10:00:55.12139663 +0000 UTC m=+3272.865656917" lastFinishedPulling="2025-12-06 10:00:56.444405261 +0000 UTC m=+3274.188665548" observedRunningTime="2025-12-06 10:00:58.024546653 +0000 UTC m=+3275.768806970" watchObservedRunningTime="2025-12-06 10:00:58.035358917 +0000 UTC m=+3275.779619214" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.499332 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-76f96bf4d5-7wt8s"] Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.544256 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b745d8b98-pzrsc"] Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.546386 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.554543 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644126 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-combined-ca-bundle\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644330 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd13bd1-0e47-4739-9f82-e673232e3c61-logs\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644352 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsmh\" (UniqueName: \"kubernetes.io/projected/afd13bd1-0e47-4739-9f82-e673232e3c61-kube-api-access-clsmh\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644396 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-secret-key\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644424 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-tls-certs\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644492 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-config-data\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.644514 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-scripts\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.701664 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b745d8b98-pzrsc"] Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.701701 
4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c65c599b9-mz2lg"] Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.723777 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c74dbc66-8ghhf"] Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.737435 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c74dbc66-8ghhf"] Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.737534 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.803795 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-combined-ca-bundle\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.804226 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd13bd1-0e47-4739-9f82-e673232e3c61-logs\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.804327 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clsmh\" (UniqueName: \"kubernetes.io/projected/afd13bd1-0e47-4739-9f82-e673232e3c61-kube-api-access-clsmh\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.804512 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-secret-key\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.804880 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd13bd1-0e47-4739-9f82-e673232e3c61-logs\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.807895 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-tls-certs\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.808221 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-config-data\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.808563 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-scripts\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") 
" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.809562 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-scripts\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.815076 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-config-data\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.838227 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-combined-ca-bundle\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.851222 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsmh\" (UniqueName: \"kubernetes.io/projected/afd13bd1-0e47-4739-9f82-e673232e3c61-kube-api-access-clsmh\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.854244 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-secret-key\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.891138 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-tls-certs\") pod \"horizon-5b745d8b98-pzrsc\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.899410 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916708 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-logs\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916772 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-scripts\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916809 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-combined-ca-bundle\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916834 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-horizon-secret-key\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916865 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qrp\" (UniqueName: \"kubernetes.io/projected/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-kube-api-access-t8qrp\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916961 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-horizon-tls-certs\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:58 crc kubenswrapper[4672]: I1206 10:00:58.916993 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-config-data\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.018565 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-horizon-tls-certs\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.019062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-config-data\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " 
pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.019103 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-logs\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.019139 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-scripts\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.019165 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-combined-ca-bundle\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.019192 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-horizon-secret-key\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.019224 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qrp\" (UniqueName: \"kubernetes.io/projected/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-kube-api-access-t8qrp\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.020537 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-logs\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.020932 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-scripts\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.021790 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-config-data\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.028574 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-horizon-tls-certs\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.028912 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-combined-ca-bundle\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.038326 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-horizon-secret-key\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.046020 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qrp\" (UniqueName: \"kubernetes.io/projected/70d8ba3e-3f2d-4627-afab-5bb8908f89eb-kube-api-access-t8qrp\") pod \"horizon-8c74dbc66-8ghhf\" (UID: \"70d8ba3e-3f2d-4627-afab-5bb8908f89eb\") " pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.082015 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0","Type":"ContainerStarted","Data":"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1"} Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.089445 4672 generic.go:334] "Generic (PLEG): container finished" podID="129c23da-4548-4207-808e-296c6e1f6396" containerID="bdbdec7cbeffcee022b2554351f1d7917ac0f455aad112cb3a5e8818b32b5d05" exitCode=0 Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.089509 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3062-account-create-update-7k7nc" event={"ID":"129c23da-4548-4207-808e-296c6e1f6396","Type":"ContainerDied","Data":"bdbdec7cbeffcee022b2554351f1d7917ac0f455aad112cb3a5e8818b32b5d05"} Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.114916 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a35af03e-7b48-40ee-a857-20824a664f4e","Type":"ContainerStarted","Data":"f94642b33d154bccaf475cc0df9c821bc4132dd529200a756b2bae210f204034"} Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.115154 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a35af03e-7b48-40ee-a857-20824a664f4e","Type":"ContainerStarted","Data":"df6be76a3d4369bed5d1c15770e4c83ad77a7d48ca38e289a8f8fe281593005f"} Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.147480 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.172561 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.938036979 podStartE2EDuration="6.172540411s" podCreationTimestamp="2025-12-06 10:00:53 +0000 UTC" firstStartedPulling="2025-12-06 10:00:56.410725639 +0000 UTC m=+3274.154985926" lastFinishedPulling="2025-12-06 10:00:57.645229071 +0000 UTC m=+3275.389489358" observedRunningTime="2025-12-06 10:00:59.144387937 +0000 UTC m=+3276.888648244" watchObservedRunningTime="2025-12-06 10:00:59.172540411 +0000 UTC m=+3276.916800698" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.293825 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.316463 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 06 10:00:59 crc kubenswrapper[4672]: I1206 10:00:59.718860 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b745d8b98-pzrsc"] Dec 06 10:00:59 crc kubenswrapper[4672]: W1206 10:00:59.743268 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd13bd1_0e47_4739_9f82_e673232e3c61.slice/crio-63be707c1f4a160a8d5ec72b4a3245a9569529caf6eaf1780ad01344c825e31a WatchSource:0}: Error finding container 63be707c1f4a160a8d5ec72b4a3245a9569529caf6eaf1780ad01344c825e31a: Status 404 returned error can't find the container with id 63be707c1f4a160a8d5ec72b4a3245a9569529caf6eaf1780ad01344c825e31a Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.046357 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fvtdd" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.069413 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54fd106c-75b6-4c7a-810c-dd13d4655cba-operator-scripts\") pod \"54fd106c-75b6-4c7a-810c-dd13d4655cba\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.069762 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn9br\" (UniqueName: \"kubernetes.io/projected/54fd106c-75b6-4c7a-810c-dd13d4655cba-kube-api-access-vn9br\") pod \"54fd106c-75b6-4c7a-810c-dd13d4655cba\" (UID: \"54fd106c-75b6-4c7a-810c-dd13d4655cba\") " Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.070773 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54fd106c-75b6-4c7a-810c-dd13d4655cba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54fd106c-75b6-4c7a-810c-dd13d4655cba" (UID: "54fd106c-75b6-4c7a-810c-dd13d4655cba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.094232 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fd106c-75b6-4c7a-810c-dd13d4655cba-kube-api-access-vn9br" (OuterVolumeSpecName: "kube-api-access-vn9br") pod "54fd106c-75b6-4c7a-810c-dd13d4655cba" (UID: "54fd106c-75b6-4c7a-810c-dd13d4655cba"). InnerVolumeSpecName "kube-api-access-vn9br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.152179 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b745d8b98-pzrsc" event={"ID":"afd13bd1-0e47-4739-9f82-e673232e3c61","Type":"ContainerStarted","Data":"63be707c1f4a160a8d5ec72b4a3245a9569529caf6eaf1780ad01344c825e31a"} Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.157648 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416921-tdtbn"] Dec 06 10:01:00 crc kubenswrapper[4672]: E1206 10:01:00.158528 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fd106c-75b6-4c7a-810c-dd13d4655cba" containerName="mariadb-database-create" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.158794 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fd106c-75b6-4c7a-810c-dd13d4655cba" containerName="mariadb-database-create" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.159073 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="54fd106c-75b6-4c7a-810c-dd13d4655cba" containerName="mariadb-database-create" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.159834 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.161931 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fvtdd" event={"ID":"54fd106c-75b6-4c7a-810c-dd13d4655cba","Type":"ContainerDied","Data":"a971c0ef0efe3b2e052f9c6c580ebeab8544b6eaf87d9445610f5c8f5eaa8148"} Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.161979 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a971c0ef0efe3b2e052f9c6c580ebeab8544b6eaf87d9445610f5c8f5eaa8148" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.162056 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-fvtdd" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.171989 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn9br\" (UniqueName: \"kubernetes.io/projected/54fd106c-75b6-4c7a-810c-dd13d4655cba-kube-api-access-vn9br\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.172020 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54fd106c-75b6-4c7a-810c-dd13d4655cba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.172286 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c68f70da-7f06-4f89-8b77-cb1fa481bc29","Type":"ContainerStarted","Data":"92f1ae125da582921c2aaaa0109016bc8caa59e034f6362bf0c01b7700be63b1"} Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.172480 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-log" containerID="cri-o://7c77bc375801e61fc4836f40c0765bd9b6fddae90557bd94a1cc73b4901ab7c3" gracePeriod=30 Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.172647 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-httpd" containerID="cri-o://92f1ae125da582921c2aaaa0109016bc8caa59e034f6362bf0c01b7700be63b1" gracePeriod=30 Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.202423 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416921-tdtbn"] Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.253828 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.25380334 podStartE2EDuration="7.25380334s" podCreationTimestamp="2025-12-06 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:00.226977632 +0000 UTC m=+3277.971237909" watchObservedRunningTime="2025-12-06 10:01:00.25380334 +0000 UTC m=+3277.998063647" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.278616 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-config-data\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.279204 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-fernet-keys\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.279297 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-combined-ca-bundle\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " 
pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.279402 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkpf\" (UniqueName: \"kubernetes.io/projected/f321169c-b38c-4403-8541-48064fd878b2-kube-api-access-zfkpf\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.308651 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c74dbc66-8ghhf"] Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.383643 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkpf\" (UniqueName: \"kubernetes.io/projected/f321169c-b38c-4403-8541-48064fd878b2-kube-api-access-zfkpf\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.383817 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-config-data\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.383836 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-fernet-keys\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.383869 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-combined-ca-bundle\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.389484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-config-data\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.394265 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-fernet-keys\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.413477 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkpf\" (UniqueName: \"kubernetes.io/projected/f321169c-b38c-4403-8541-48064fd878b2-kube-api-access-zfkpf\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.596698 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-combined-ca-bundle\") pod \"keystone-cron-29416921-tdtbn\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.602018 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:01:00 crc kubenswrapper[4672]: E1206 10:01:00.602477 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:01:00 crc kubenswrapper[4672]: I1206 10:01:00.630449 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.108066 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.130400 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129c23da-4548-4207-808e-296c6e1f6396-operator-scripts\") pod \"129c23da-4548-4207-808e-296c6e1f6396\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.130571 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j2qj\" (UniqueName: \"kubernetes.io/projected/129c23da-4548-4207-808e-296c6e1f6396-kube-api-access-2j2qj\") pod \"129c23da-4548-4207-808e-296c6e1f6396\" (UID: \"129c23da-4548-4207-808e-296c6e1f6396\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.142446 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c23da-4548-4207-808e-296c6e1f6396-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "129c23da-4548-4207-808e-296c6e1f6396" (UID: "129c23da-4548-4207-808e-296c6e1f6396"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.187281 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129c23da-4548-4207-808e-296c6e1f6396-kube-api-access-2j2qj" (OuterVolumeSpecName: "kube-api-access-2j2qj") pod "129c23da-4548-4207-808e-296c6e1f6396" (UID: "129c23da-4548-4207-808e-296c6e1f6396"). InnerVolumeSpecName "kube-api-access-2j2qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.232172 4672 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129c23da-4548-4207-808e-296c6e1f6396-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.232198 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j2qj\" (UniqueName: \"kubernetes.io/projected/129c23da-4548-4207-808e-296c6e1f6396-kube-api-access-2j2qj\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.271269 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-log" containerID="cri-o://42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1" gracePeriod=30 Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.271536 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0","Type":"ContainerStarted","Data":"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c"} Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.271872 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-httpd" containerID="cri-o://5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c" gracePeriod=30 Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.296025 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3062-account-create-update-7k7nc" event={"ID":"129c23da-4548-4207-808e-296c6e1f6396","Type":"ContainerDied","Data":"3090f0031848a6bf3c8a021d9097785a1f3dbe9b99e3f54cb478402d8e24aa94"} Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.296376 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3090f0031848a6bf3c8a021d9097785a1f3dbe9b99e3f54cb478402d8e24aa94" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.296443 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3062-account-create-update-7k7nc" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.341458 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.341445711 podStartE2EDuration="8.341445711s" podCreationTimestamp="2025-12-06 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:01.340975739 +0000 UTC m=+3279.085236026" watchObservedRunningTime="2025-12-06 10:01:01.341445711 +0000 UTC m=+3279.085705998" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.380765 4672 generic.go:334] "Generic (PLEG): container finished" podID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerID="92f1ae125da582921c2aaaa0109016bc8caa59e034f6362bf0c01b7700be63b1" exitCode=143 Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.380795 4672 generic.go:334] "Generic (PLEG): container finished" podID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerID="7c77bc375801e61fc4836f40c0765bd9b6fddae90557bd94a1cc73b4901ab7c3" exitCode=143 Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.380859 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c68f70da-7f06-4f89-8b77-cb1fa481bc29","Type":"ContainerDied","Data":"92f1ae125da582921c2aaaa0109016bc8caa59e034f6362bf0c01b7700be63b1"} Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.380884 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c68f70da-7f06-4f89-8b77-cb1fa481bc29","Type":"ContainerDied","Data":"7c77bc375801e61fc4836f40c0765bd9b6fddae90557bd94a1cc73b4901ab7c3"} Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.383287 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416921-tdtbn"] Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.428512 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c74dbc66-8ghhf" event={"ID":"70d8ba3e-3f2d-4627-afab-5bb8908f89eb","Type":"ContainerStarted","Data":"5c1736863402c5c20796e6ab8d1a2442426c52ca037bcbe2e0655f3d253d8fd5"} Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.611510 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662274 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-httpd-run\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662628 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-public-tls-certs\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662672 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgm7n\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-kube-api-access-vgm7n\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662726 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-config-data\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662751 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-logs\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662851 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-combined-ca-bundle\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.662972 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-ceph\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.663033 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.663153 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-scripts\") pod \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\" (UID: \"c68f70da-7f06-4f89-8b77-cb1fa481bc29\") " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.666126 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-logs" (OuterVolumeSpecName: "logs") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.667418 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.672345 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.672380 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68f70da-7f06-4f89-8b77-cb1fa481bc29-logs\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.685725 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-kube-api-access-vgm7n" (OuterVolumeSpecName: "kube-api-access-vgm7n") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "kube-api-access-vgm7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.688924 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-ceph" (OuterVolumeSpecName: "ceph") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.697345 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.708467 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-scripts" (OuterVolumeSpecName: "scripts") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.754373 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.774450 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.774701 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.774791 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.774850 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.774941 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgm7n\" (UniqueName: \"kubernetes.io/projected/c68f70da-7f06-4f89-8b77-cb1fa481bc29-kube-api-access-vgm7n\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.806950 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.816814 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-config-data" (OuterVolumeSpecName: "config-data") pod "c68f70da-7f06-4f89-8b77-cb1fa481bc29" (UID: "c68f70da-7f06-4f89-8b77-cb1fa481bc29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.824802 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.881403 4672 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.881433 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68f70da-7f06-4f89-8b77-cb1fa481bc29-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:01 crc kubenswrapper[4672]: I1206 10:01:01.881442 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.423068 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.459883 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-tdtbn" event={"ID":"f321169c-b38c-4403-8541-48064fd878b2","Type":"ContainerStarted","Data":"8152a814b4dde0a33f11ce539d9893b4037f276172f2e15d560b6174b030e60c"} Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.459922 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-tdtbn" event={"ID":"f321169c-b38c-4403-8541-48064fd878b2","Type":"ContainerStarted","Data":"bf27932a310c59c2c08717a333fc39540d0f234f0b4fb5ba820997a3a8c4d1d4"} Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.477842 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.478819 4672 generic.go:334] "Generic (PLEG): container finished" podID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerID="5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c" exitCode=0 Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.478843 4672 generic.go:334] "Generic (PLEG): container finished" podID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerID="42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1" exitCode=143 Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.478873 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0","Type":"ContainerDied","Data":"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c"} Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.478902 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0","Type":"ContainerDied","Data":"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1"} Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.478912 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0","Type":"ContainerDied","Data":"f01e7a165873833ed18debf401b773560d06375996329cbd6743d3e8f2876426"} Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.478926 4672 scope.go:117] "RemoveContainer" containerID="5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.492863 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416921-tdtbn" podStartSLOduration=2.492848412 podStartE2EDuration="2.492848412s" podCreationTimestamp="2025-12-06 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:02.486984633 +0000 UTC m=+3280.231244920" watchObservedRunningTime="2025-12-06 10:01:02.492848412 +0000 UTC m=+3280.237108699" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494487 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl6qm\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-kube-api-access-fl6qm\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494541 4672 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-ceph\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494589 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-logs\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494627 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-internal-tls-certs\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494677 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-httpd-run\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494718 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494741 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-config-data\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494852 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-scripts\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.494917 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-combined-ca-bundle\") pod \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\" (UID: \"6f41fc45-9ebc-4b9d-b076-338a18c1e7a0\") " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.499138 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.500296 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-logs" (OuterVolumeSpecName: "logs") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.508562 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-ceph" (OuterVolumeSpecName: "ceph") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.522810 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-kube-api-access-fl6qm" (OuterVolumeSpecName: "kube-api-access-fl6qm") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "kube-api-access-fl6qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.533561 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-scripts" (OuterVolumeSpecName: "scripts") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.541938 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c68f70da-7f06-4f89-8b77-cb1fa481bc29","Type":"ContainerDied","Data":"e44116545b3f0e3561d0ca59248552fa6d9569e890e6176a6fa4d99b0f16de52"} Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.542039 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.588777 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.594684 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.604853 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.606612 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl6qm\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-kube-api-access-fl6qm\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.606634 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.606643 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-logs\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.606652 4672 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.606690 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.606711 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.639126 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwt4q"] Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.639846 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-httpd" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.639864 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-httpd" Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.639880 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-log" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.639889 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-log" Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.639916 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-httpd" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.639923 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-httpd" Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.639940 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129c23da-4548-4207-808e-296c6e1f6396" containerName="mariadb-account-create-update" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.639953 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="129c23da-4548-4207-808e-296c6e1f6396" containerName="mariadb-account-create-update" Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.639965 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-log" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.639973 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-log" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.640250 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-httpd" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.640262 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-log" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.640278 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="129c23da-4548-4207-808e-296c6e1f6396" containerName="mariadb-account-create-update" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.640301 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" containerName="glance-httpd" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.640313 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" containerName="glance-log" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.645316 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwt4q"] Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.645538 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.675329 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.708616 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-utilities\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.712794 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-catalog-content\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.713064 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vbt\" (UniqueName: \"kubernetes.io/projected/de298690-6d2a-44d6-9744-0a29c7041aa4-kube-api-access-l5vbt\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.713224 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" 
DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.728713 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.793651 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-config-data" (OuterVolumeSpecName: "config-data") pod "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" (UID: "6f41fc45-9ebc-4b9d-b076-338a18c1e7a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.804052 4672 scope.go:117] "RemoveContainer" containerID="42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.814825 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vbt\" (UniqueName: \"kubernetes.io/projected/de298690-6d2a-44d6-9744-0a29c7041aa4-kube-api-access-l5vbt\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.814925 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-utilities\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.814998 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-catalog-content\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.815063 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.815078 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.815429 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-catalog-content\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.815929 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-utilities\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 
06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.838681 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.857558 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vbt\" (UniqueName: \"kubernetes.io/projected/de298690-6d2a-44d6-9744-0a29c7041aa4-kube-api-access-l5vbt\") pod \"redhat-marketplace-dwt4q\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.860027 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.875712 4672 scope.go:117] "RemoveContainer" containerID="5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c" Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.877230 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c\": container with ID starting with 5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c not found: ID does not exist" containerID="5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.877271 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c"} err="failed to get container status \"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c\": rpc error: code = NotFound desc = could not find container \"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c\": container with ID starting with 5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c not found: ID does not exist" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.877292 4672 scope.go:117] "RemoveContainer" containerID="42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1" Dec 06 10:01:02 crc kubenswrapper[4672]: E1206 10:01:02.877884 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1\": container with ID starting with 42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1 not found: ID does not exist" containerID="42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.877906 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1"} err="failed to get container status \"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1\": rpc error: code = NotFound desc = could not find container \"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1\": container with ID starting with 42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1 not found: ID does not exist" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.877928 4672 scope.go:117] "RemoveContainer" containerID="5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.878179 4672 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c"} err="failed to get container status \"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c\": rpc error: code = NotFound desc = could not find container \"5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c\": container with ID starting with 5fb4231af60e529824f288801f2a25aefc133e8c0634408fd45ff60b71c5c43c not found: ID does not exist" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.878199 4672 scope.go:117] "RemoveContainer" containerID="42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.878389 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1"} err="failed to get container status \"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1\": rpc error: code = NotFound desc = could not find container \"42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1\": container with ID starting with 42e671b00ad788843e79d1d20736cf39ba806f16acceae8e4afff023bbfb05c1 not found: ID does not exist" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.878407 4672 scope.go:117] "RemoveContainer" containerID="92f1ae125da582921c2aaaa0109016bc8caa59e034f6362bf0c01b7700be63b1" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.885316 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.886934 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.892192 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.892368 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.913208 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:01:02 crc kubenswrapper[4672]: I1206 10:01:02.961295 4672 scope.go:117] "RemoveContainer" containerID="7c77bc375801e61fc4836f40c0765bd9b6fddae90557bd94a1cc73b4901ab7c3" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018139 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018194 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018229 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018313 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh9xv\" (UniqueName: \"kubernetes.io/projected/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-kube-api-access-xh9xv\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018341 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018366 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018398 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-ceph\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018424 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-logs\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.018442 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.039537 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.121901 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.121976 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.121998 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.122075 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh9xv\" (UniqueName: \"kubernetes.io/projected/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-kube-api-access-xh9xv\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.122112 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.122146 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.122193 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-ceph\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.122231 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-logs\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.122248 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc 
kubenswrapper[4672]: I1206 10:01:03.122896 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.124968 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.148505 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.158775 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.161890 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.166258 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-logs\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.167967 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-ceph\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.208374 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh9xv\" (UniqueName: \"kubernetes.io/projected/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-kube-api-access-xh9xv\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.224034 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6d2d22-6464-4bf5-9bf6-e3515efedbf4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.249689 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.335209 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4\") " pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.336948 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.384767 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.390444 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.398695 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.398887 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.426992 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.512616 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.549713 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.549984 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngw98\" (UniqueName: \"kubernetes.io/projected/97cea7c5-c51e-4001-b398-28bdbccd9a97-kube-api-access-ngw98\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550091 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550203 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550323 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550431 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97cea7c5-c51e-4001-b398-28bdbccd9a97-ceph\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550638 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cea7c5-c51e-4001-b398-28bdbccd9a97-logs\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:03 crc kubenswrapper[4672]: I1206 10:01:03.550717 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97cea7c5-c51e-4001-b398-28bdbccd9a97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: E1206 10:01:03.627057 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f41fc45_9ebc_4b9d_b076_338a18c1e7a0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f41fc45_9ebc_4b9d_b076_338a18c1e7a0.slice/crio-f01e7a165873833ed18debf401b773560d06375996329cbd6743d3e8f2876426\": RecentStats: unable to find data in memory cache]" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.654877 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.655413 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngw98\" (UniqueName: \"kubernetes.io/projected/97cea7c5-c51e-4001-b398-28bdbccd9a97-kube-api-access-ngw98\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.655469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.655577 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.655761 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.657650 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.660389 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97cea7c5-c51e-4001-b398-28bdbccd9a97-ceph\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.661078 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.661164 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cea7c5-c51e-4001-b398-28bdbccd9a97-logs\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.661182 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97cea7c5-c51e-4001-b398-28bdbccd9a97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.662899 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.663264 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.663827 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.667971 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cea7c5-c51e-4001-b398-28bdbccd9a97-logs\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.668813 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97cea7c5-c51e-4001-b398-28bdbccd9a97-ceph\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.668997 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97cea7c5-c51e-4001-b398-28bdbccd9a97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.670446 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cea7c5-c51e-4001-b398-28bdbccd9a97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.685510 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngw98\" (UniqueName: \"kubernetes.io/projected/97cea7c5-c51e-4001-b398-28bdbccd9a97-kube-api-access-ngw98\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.703303 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97cea7c5-c51e-4001-b398-28bdbccd9a97\") " pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.723095 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:03.789012 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwt4q"] Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.580749 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="a35af03e-7b48-40ee-a857-20824a664f4e" containerName="cinder-volume" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.582838 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f41fc45-9ebc-4b9d-b076-338a18c1e7a0" path="/var/lib/kubelet/pods/6f41fc45-9ebc-4b9d-b076-338a18c1e7a0/volumes" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.583876 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68f70da-7f06-4f89-8b77-cb1fa481bc29" path="/var/lib/kubelet/pods/c68f70da-7f06-4f89-8b77-cb1fa481bc29/volumes" Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.604678 4672 generic.go:334] "Generic (PLEG): container finished" podID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerID="bae3e204a33a66a3f350857f091b3045c789047d9dafbf3330ec419b50607eb0" exitCode=0 Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.604742 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwt4q" event={"ID":"de298690-6d2a-44d6-9744-0a29c7041aa4","Type":"ContainerDied","Data":"bae3e204a33a66a3f350857f091b3045c789047d9dafbf3330ec419b50607eb0"} Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.604789 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwt4q" event={"ID":"de298690-6d2a-44d6-9744-0a29c7041aa4","Type":"ContainerStarted","Data":"d9bdfd9d127425ec6ff143aa9b9818d254ed760dccafc13c6846d07ed4883b7e"} Dec 06 10:01:04 crc kubenswrapper[4672]: I1206 10:01:04.739104 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.028947 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.482410 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-96zr2"] Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.484245 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.487631 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-rv9j5" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.488337 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.495810 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-96zr2"] Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.539489 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-job-config-data\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.539635 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtr88\" (UniqueName: \"kubernetes.io/projected/5bfe87b4-aa9f-475a-bba9-438425d79d47-kube-api-access-mtr88\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.541751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-config-data\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.541785 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-combined-ca-bundle\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.664774 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4","Type":"ContainerStarted","Data":"c93429771b6e17e03e1b4e61569adf0457cdad33ced142a48b87089adf3590ab"} Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.678647 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-job-config-data\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.678759 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtr88\" (UniqueName: \"kubernetes.io/projected/5bfe87b4-aa9f-475a-bba9-438425d79d47-kube-api-access-mtr88\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.678977 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-config-data\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " 
pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.678997 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-combined-ca-bundle\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.717041 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-combined-ca-bundle\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.736685 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-job-config-data\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.739863 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtr88\" (UniqueName: \"kubernetes.io/projected/5bfe87b4-aa9f-475a-bba9-438425d79d47-kube-api-access-mtr88\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.765771 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-config-data\") pod \"manila-db-sync-96zr2\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:05 crc kubenswrapper[4672]: I1206 10:01:05.867462 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.072481 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 10:01:06 crc kubenswrapper[4672]: W1206 10:01:06.127126 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cea7c5_c51e_4001_b398_28bdbccd9a97.slice/crio-161e4c8e02788322e1d09e5e2f5da0d237f1bd8ad98552578178c7d7f7085d55 WatchSource:0}: Error finding container 161e4c8e02788322e1d09e5e2f5da0d237f1bd8ad98552578178c7d7f7085d55: Status 404 returned error can't find the container with id 161e4c8e02788322e1d09e5e2f5da0d237f1bd8ad98552578178c7d7f7085d55 Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.688065 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4","Type":"ContainerStarted","Data":"a4d7bd5d02ae9963b6d9a95c7b2fde85398a730ade2cb12e847a6e8f8fdd614c"} Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.691212 4672 generic.go:334] "Generic (PLEG): container finished" podID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerID="49deab2f933611316ba84f96de72e25052d2f197920a9d3f024c6218fa3aa90e" exitCode=0 Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.691278 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwt4q" event={"ID":"de298690-6d2a-44d6-9744-0a29c7041aa4","Type":"ContainerDied","Data":"49deab2f933611316ba84f96de72e25052d2f197920a9d3f024c6218fa3aa90e"} Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.704517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97cea7c5-c51e-4001-b398-28bdbccd9a97","Type":"ContainerStarted","Data":"161e4c8e02788322e1d09e5e2f5da0d237f1bd8ad98552578178c7d7f7085d55"} Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.710526 4672 generic.go:334] "Generic (PLEG): container finished" podID="f321169c-b38c-4403-8541-48064fd878b2" containerID="8152a814b4dde0a33f11ce539d9893b4037f276172f2e15d560b6174b030e60c" exitCode=0 Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.710582 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-tdtbn" event={"ID":"f321169c-b38c-4403-8541-48064fd878b2","Type":"ContainerDied","Data":"8152a814b4dde0a33f11ce539d9893b4037f276172f2e15d560b6174b030e60c"} Dec 06 10:01:06 crc kubenswrapper[4672]: I1206 10:01:06.832050 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-96zr2"] Dec 06 10:01:06 crc kubenswrapper[4672]: W1206 10:01:06.878621 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bfe87b4_aa9f_475a_bba9_438425d79d47.slice/crio-9f01f977fcd48c7d56cf6e75ee3010915c5fbcbb538258a8d44ef2c2a15d0cb0 WatchSource:0}: Error finding container 9f01f977fcd48c7d56cf6e75ee3010915c5fbcbb538258a8d44ef2c2a15d0cb0: Status 404 returned error can't find the container with id 9f01f977fcd48c7d56cf6e75ee3010915c5fbcbb538258a8d44ef2c2a15d0cb0 Dec 06 10:01:07 crc kubenswrapper[4672]: I1206 10:01:07.722263 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"97cea7c5-c51e-4001-b398-28bdbccd9a97","Type":"ContainerStarted","Data":"37d0d1f24e077e0857cc37f32923f9c1b009faff90bc64519661c5cf976dc04a"} Dec 06 10:01:07 crc kubenswrapper[4672]: I1206 10:01:07.724885 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-96zr2" event={"ID":"5bfe87b4-aa9f-475a-bba9-438425d79d47","Type":"ContainerStarted","Data":"9f01f977fcd48c7d56cf6e75ee3010915c5fbcbb538258a8d44ef2c2a15d0cb0"} Dec 06 10:01:07 crc kubenswrapper[4672]: I1206 10:01:07.732424 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a6d2d22-6464-4bf5-9bf6-e3515efedbf4","Type":"ContainerStarted","Data":"7646aa263db433530abfe3a7c987255d7a0738b6242502e19a8a70783b8bdbc2"} Dec 06 10:01:07 crc kubenswrapper[4672]: I1206 10:01:07.741513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwt4q" event={"ID":"de298690-6d2a-44d6-9744-0a29c7041aa4","Type":"ContainerStarted","Data":"265bc499f2c5367e860f9dde0172180b7161d871b58dd19bf6fbaea1c550fb38"} Dec 06 10:01:07 crc kubenswrapper[4672]: I1206 10:01:07.778077 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.778053526 podStartE2EDuration="5.778053526s" podCreationTimestamp="2025-12-06 10:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:07.759307507 +0000 UTC m=+3285.503567794" watchObservedRunningTime="2025-12-06 10:01:07.778053526 +0000 UTC m=+3285.522313813" Dec 06 10:01:07 crc kubenswrapper[4672]: I1206 10:01:07.796558 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwt4q" podStartSLOduration=3.116525829 podStartE2EDuration="5.796542466s" podCreationTimestamp="2025-12-06 10:01:02 +0000 UTC" firstStartedPulling="2025-12-06 10:01:04.617834752 +0000 UTC m=+3282.362095039" lastFinishedPulling="2025-12-06 10:01:07.297851389 +0000 UTC m=+3285.042111676" observedRunningTime="2025-12-06 10:01:07.790134383 +0000 UTC m=+3285.534394690" watchObservedRunningTime="2025-12-06 10:01:07.796542466 +0000 UTC m=+3285.540802753" Dec 06 10:01:08 crc kubenswrapper[4672]: I1206 10:01:08.756908 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97cea7c5-c51e-4001-b398-28bdbccd9a97","Type":"ContainerStarted","Data":"f84d31c7caf78f06d3b7caad74b138652856d30f675100a07410b3267154d4c7"} Dec 06 10:01:08 crc kubenswrapper[4672]: I1206 10:01:08.788255 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.788239338 podStartE2EDuration="5.788239338s" podCreationTimestamp="2025-12-06 10:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:08.778073673 +0000 UTC m=+3286.522333970" watchObservedRunningTime="2025-12-06 10:01:08.788239338 +0000 UTC m=+3286.532499625" Dec 06 10:01:09 crc kubenswrapper[4672]: I1206 10:01:09.292662 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.040856 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwt4q" 
Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.041352 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.094700 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.513694 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.513754 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.723328 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.723367 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.853190 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:13 crc kubenswrapper[4672]: I1206 10:01:13.918376 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwt4q"] Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.014372 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.015776 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.018225 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.018434 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.018987 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.020756 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.200273 4672 util.go:48] "No ready sandbox for pod can be found. 
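The probe records above show the usual settling pattern rather than a fault: each freshly started glance pod's startup probe reports unhealthy on its first tick, flips to started within a second, and readiness follows a few seconds later (the ready transitions appear below at 10:01:17). The empty status="" entries are probes whose result has not been recorded yet. A sketch for turning these records into a per-pod timeline, under the same assumptions as the previous snippet (one record per line, hypothetical input file name):

import re
import sys

# Leading group captures the journal timestamp ("Dec 06 10:01:13");
# the rest pulls the klog key/value fields of the probe record.
PROBE = re.compile(r'^(\w{3} \d+ [\d:]+) \S+ .*"SyncLoop \(probe\)" probe="(\w+)" status="([^"]*)" pod="([^"]+)"')

with open(sys.argv[1], encoding="utf-8") as fh:
    for line in fh:
        m = PROBE.match(line)
        if m:
            ts, probe, status, pod = m.groups()
            print(f'{ts}  {pod:55} {probe:9} -> {status or "(no result yet)"}')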
Need to start a new one" pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.301120 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-config-data\") pod \"f321169c-b38c-4403-8541-48064fd878b2\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.301244 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-fernet-keys\") pod \"f321169c-b38c-4403-8541-48064fd878b2\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.301393 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-combined-ca-bundle\") pod \"f321169c-b38c-4403-8541-48064fd878b2\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.301814 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfkpf\" (UniqueName: \"kubernetes.io/projected/f321169c-b38c-4403-8541-48064fd878b2-kube-api-access-zfkpf\") pod \"f321169c-b38c-4403-8541-48064fd878b2\" (UID: \"f321169c-b38c-4403-8541-48064fd878b2\") " Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.306343 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f321169c-b38c-4403-8541-48064fd878b2" (UID: "f321169c-b38c-4403-8541-48064fd878b2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.315916 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f321169c-b38c-4403-8541-48064fd878b2-kube-api-access-zfkpf" (OuterVolumeSpecName: "kube-api-access-zfkpf") pod "f321169c-b38c-4403-8541-48064fd878b2" (UID: "f321169c-b38c-4403-8541-48064fd878b2"). InnerVolumeSpecName "kube-api-access-zfkpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.350199 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f321169c-b38c-4403-8541-48064fd878b2" (UID: "f321169c-b38c-4403-8541-48064fd878b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.400064 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-config-data" (OuterVolumeSpecName: "config-data") pod "f321169c-b38c-4403-8541-48064fd878b2" (UID: "f321169c-b38c-4403-8541-48064fd878b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.404423 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.404479 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.404490 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f321169c-b38c-4403-8541-48064fd878b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.404503 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfkpf\" (UniqueName: \"kubernetes.io/projected/f321169c-b38c-4403-8541-48064fd878b2-kube-api-access-zfkpf\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.809746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-tdtbn" event={"ID":"f321169c-b38c-4403-8541-48064fd878b2","Type":"ContainerDied","Data":"bf27932a310c59c2c08717a333fc39540d0f234f0b4fb5ba820997a3a8c4d1d4"} Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.810302 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf27932a310c59c2c08717a333fc39540d0f234f0b4fb5ba820997a3a8c4d1d4" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.809827 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416921-tdtbn" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.810545 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:14 crc kubenswrapper[4672]: I1206 10:01:14.810577 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 10:01:15 crc kubenswrapper[4672]: I1206 10:01:15.822484 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 10:01:15 crc kubenswrapper[4672]: I1206 10:01:15.823866 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 10:01:15 crc kubenswrapper[4672]: I1206 10:01:15.825147 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dwt4q" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="registry-server" containerID="cri-o://265bc499f2c5367e860f9dde0172180b7161d871b58dd19bf6fbaea1c550fb38" gracePeriod=2 Dec 06 10:01:16 crc kubenswrapper[4672]: I1206 10:01:16.557367 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:01:16 crc kubenswrapper[4672]: I1206 10:01:16.846552 4672 generic.go:334] "Generic (PLEG): container finished" podID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerID="265bc499f2c5367e860f9dde0172180b7161d871b58dd19bf6fbaea1c550fb38" exitCode=0 Dec 06 10:01:16 crc kubenswrapper[4672]: I1206 10:01:16.846622 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwt4q" 
event={"ID":"de298690-6d2a-44d6-9744-0a29c7041aa4","Type":"ContainerDied","Data":"265bc499f2c5367e860f9dde0172180b7161d871b58dd19bf6fbaea1c550fb38"} Dec 06 10:01:17 crc kubenswrapper[4672]: I1206 10:01:17.964586 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 10:01:17 crc kubenswrapper[4672]: I1206 10:01:17.965682 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 10:01:17 crc kubenswrapper[4672]: I1206 10:01:17.973245 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:17 crc kubenswrapper[4672]: I1206 10:01:17.973352 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 10:01:18 crc kubenswrapper[4672]: I1206 10:01:17.996253 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 10:01:18 crc kubenswrapper[4672]: I1206 10:01:18.020357 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.259880 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.334438 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-utilities\") pod \"de298690-6d2a-44d6-9744-0a29c7041aa4\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.334518 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-catalog-content\") pod \"de298690-6d2a-44d6-9744-0a29c7041aa4\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.334580 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vbt\" (UniqueName: \"kubernetes.io/projected/de298690-6d2a-44d6-9744-0a29c7041aa4-kube-api-access-l5vbt\") pod \"de298690-6d2a-44d6-9744-0a29c7041aa4\" (UID: \"de298690-6d2a-44d6-9744-0a29c7041aa4\") " Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.336163 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-utilities" (OuterVolumeSpecName: "utilities") pod "de298690-6d2a-44d6-9744-0a29c7041aa4" (UID: "de298690-6d2a-44d6-9744-0a29c7041aa4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.343409 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de298690-6d2a-44d6-9744-0a29c7041aa4-kube-api-access-l5vbt" (OuterVolumeSpecName: "kube-api-access-l5vbt") pod "de298690-6d2a-44d6-9744-0a29c7041aa4" (UID: "de298690-6d2a-44d6-9744-0a29c7041aa4"). InnerVolumeSpecName "kube-api-access-l5vbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.358589 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de298690-6d2a-44d6-9744-0a29c7041aa4" (UID: "de298690-6d2a-44d6-9744-0a29c7041aa4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.438393 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.438758 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de298690-6d2a-44d6-9744-0a29c7041aa4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.438847 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vbt\" (UniqueName: \"kubernetes.io/projected/de298690-6d2a-44d6-9744-0a29c7041aa4-kube-api-access-l5vbt\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.937965 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b745d8b98-pzrsc" event={"ID":"afd13bd1-0e47-4739-9f82-e673232e3c61","Type":"ContainerStarted","Data":"5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641"} Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.950574 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76f96bf4d5-7wt8s" event={"ID":"9b071d77-1fdb-4938-95ee-10e91492c545","Type":"ContainerStarted","Data":"56b6135ba52ae6cdc807e3227322d4780647411d55647b5b69917ee780a7526a"} Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.977950 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwt4q" event={"ID":"de298690-6d2a-44d6-9744-0a29c7041aa4","Type":"ContainerDied","Data":"d9bdfd9d127425ec6ff143aa9b9818d254ed760dccafc13c6846d07ed4883b7e"} Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.978810 4672 scope.go:117] "RemoveContainer" containerID="265bc499f2c5367e860f9dde0172180b7161d871b58dd19bf6fbaea1c550fb38" Dec 06 10:01:19 crc kubenswrapper[4672]: I1206 10:01:19.979044 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwt4q" Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.009996 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c65c599b9-mz2lg" event={"ID":"d26c88b2-357e-4c41-8f93-1f6422329000","Type":"ContainerStarted","Data":"a8b766d4589057d37abd957828d40407b2391ad1169cfd32ac838699e89eec28"} Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.010053 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c65c599b9-mz2lg" event={"ID":"d26c88b2-357e-4c41-8f93-1f6422329000","Type":"ContainerStarted","Data":"c0849faff0fa8114dafc6ad4ab8ba1a0e601ad343bcb00cd253d76d3d3ec7cdf"} Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.010269 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c65c599b9-mz2lg" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon-log" containerID="cri-o://c0849faff0fa8114dafc6ad4ab8ba1a0e601ad343bcb00cd253d76d3d3ec7cdf" gracePeriod=30 Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.010823 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c65c599b9-mz2lg" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon" containerID="cri-o://a8b766d4589057d37abd957828d40407b2391ad1169cfd32ac838699e89eec28" gracePeriod=30 Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.028272 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"b52c4a53b3f1a8646d79189c3bda17a1d38aee3f1effad6325de3a3cc88f79b3"} Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.042413 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c74dbc66-8ghhf" event={"ID":"70d8ba3e-3f2d-4627-afab-5bb8908f89eb","Type":"ContainerStarted","Data":"d0d6c29a685a95836aaf2ede2070ddb2ab895d80aaa119f51e330d0551369c5e"} Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.065252 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c65c599b9-mz2lg" podStartSLOduration=3.086000359 podStartE2EDuration="25.065230658s" podCreationTimestamp="2025-12-06 10:00:55 +0000 UTC" firstStartedPulling="2025-12-06 10:00:57.014614367 +0000 UTC m=+3274.758874654" lastFinishedPulling="2025-12-06 10:01:18.993844666 +0000 UTC m=+3296.738104953" observedRunningTime="2025-12-06 10:01:20.050416746 +0000 UTC m=+3297.794677033" watchObservedRunningTime="2025-12-06 10:01:20.065230658 +0000 UTC m=+3297.809490945" Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.070634 4672 scope.go:117] "RemoveContainer" containerID="49deab2f933611316ba84f96de72e25052d2f197920a9d3f024c6218fa3aa90e" Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.131449 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwt4q"] Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.138932 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwt4q"] Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.171440 4672 scope.go:117] "RemoveContainer" containerID="bae3e204a33a66a3f350857f091b3045c789047d9dafbf3330ec419b50607eb0" Dec 06 10:01:20 crc kubenswrapper[4672]: I1206 10:01:20.567155 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" 
path="/var/lib/kubelet/pods/de298690-6d2a-44d6-9744-0a29c7041aa4/volumes" Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.077761 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-96zr2" event={"ID":"5bfe87b4-aa9f-475a-bba9-438425d79d47","Type":"ContainerStarted","Data":"146877679d3fbb7046b748f1038bf730ca69f61a25f128667cf8a4bb0cdc7c11"} Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.085636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c74dbc66-8ghhf" event={"ID":"70d8ba3e-3f2d-4627-afab-5bb8908f89eb","Type":"ContainerStarted","Data":"5f6ce74ababfe59d034e146bc8b8ebc8188ff97113fae60924ad0a0aef93a93c"} Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.100942 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-96zr2" podStartSLOduration=4.009530836 podStartE2EDuration="16.100925611s" podCreationTimestamp="2025-12-06 10:01:05 +0000 UTC" firstStartedPulling="2025-12-06 10:01:06.890042115 +0000 UTC m=+3284.634302402" lastFinishedPulling="2025-12-06 10:01:18.98143689 +0000 UTC m=+3296.725697177" observedRunningTime="2025-12-06 10:01:21.094053165 +0000 UTC m=+3298.838313462" watchObservedRunningTime="2025-12-06 10:01:21.100925611 +0000 UTC m=+3298.845185898" Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.106749 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b745d8b98-pzrsc" event={"ID":"afd13bd1-0e47-4739-9f82-e673232e3c61","Type":"ContainerStarted","Data":"904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d"} Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.118521 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76f96bf4d5-7wt8s" event={"ID":"9b071d77-1fdb-4938-95ee-10e91492c545","Type":"ContainerStarted","Data":"b2add5d8255b5d6dc54de6dd5ffbb569f6e68989c4a04cc15b366791492dc2b6"} Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.118677 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76f96bf4d5-7wt8s" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon-log" containerID="cri-o://56b6135ba52ae6cdc807e3227322d4780647411d55647b5b69917ee780a7526a" gracePeriod=30 Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.118924 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76f96bf4d5-7wt8s" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon" containerID="cri-o://b2add5d8255b5d6dc54de6dd5ffbb569f6e68989c4a04cc15b366791492dc2b6" gracePeriod=30 Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.127313 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8c74dbc66-8ghhf" podStartSLOduration=4.590653084 podStartE2EDuration="23.127294507s" podCreationTimestamp="2025-12-06 10:00:58 +0000 UTC" firstStartedPulling="2025-12-06 10:01:00.447839579 +0000 UTC m=+3278.192099866" lastFinishedPulling="2025-12-06 10:01:18.984481002 +0000 UTC m=+3296.728741289" observedRunningTime="2025-12-06 10:01:21.115916288 +0000 UTC m=+3298.860176565" watchObservedRunningTime="2025-12-06 10:01:21.127294507 +0000 UTC m=+3298.871554794" Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.158254 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b745d8b98-pzrsc" podStartSLOduration=3.940418007 podStartE2EDuration="23.158235565s" podCreationTimestamp="2025-12-06 10:00:58 +0000 
UTC" firstStartedPulling="2025-12-06 10:00:59.763611842 +0000 UTC m=+3277.507872129" lastFinishedPulling="2025-12-06 10:01:18.9814294 +0000 UTC m=+3296.725689687" observedRunningTime="2025-12-06 10:01:21.143812784 +0000 UTC m=+3298.888073091" watchObservedRunningTime="2025-12-06 10:01:21.158235565 +0000 UTC m=+3298.902495852" Dec 06 10:01:21 crc kubenswrapper[4672]: I1206 10:01:21.178529 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76f96bf4d5-7wt8s" podStartSLOduration=4.269314525 podStartE2EDuration="26.178509995s" podCreationTimestamp="2025-12-06 10:00:55 +0000 UTC" firstStartedPulling="2025-12-06 10:00:57.084887542 +0000 UTC m=+3274.829147829" lastFinishedPulling="2025-12-06 10:01:18.994083012 +0000 UTC m=+3296.738343299" observedRunningTime="2025-12-06 10:01:21.175904224 +0000 UTC m=+3298.920164511" watchObservedRunningTime="2025-12-06 10:01:21.178509995 +0000 UTC m=+3298.922770282" Dec 06 10:01:25 crc kubenswrapper[4672]: I1206 10:01:25.873981 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c65c599b9-mz2lg" Dec 06 10:01:26 crc kubenswrapper[4672]: I1206 10:01:26.246696 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76f96bf4d5-7wt8s" Dec 06 10:01:28 crc kubenswrapper[4672]: I1206 10:01:28.899785 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:01:28 crc kubenswrapper[4672]: I1206 10:01:28.900164 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:01:29 crc kubenswrapper[4672]: I1206 10:01:29.148991 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:01:29 crc kubenswrapper[4672]: I1206 10:01:29.149401 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:01:33 crc kubenswrapper[4672]: I1206 10:01:33.221239 4672 generic.go:334] "Generic (PLEG): container finished" podID="5bfe87b4-aa9f-475a-bba9-438425d79d47" containerID="146877679d3fbb7046b748f1038bf730ca69f61a25f128667cf8a4bb0cdc7c11" exitCode=0 Dec 06 10:01:33 crc kubenswrapper[4672]: I1206 10:01:33.222105 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-96zr2" event={"ID":"5bfe87b4-aa9f-475a-bba9-438425d79d47","Type":"ContainerDied","Data":"146877679d3fbb7046b748f1038bf730ca69f61a25f128667cf8a4bb0cdc7c11"} Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.135770 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.243405 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-96zr2" event={"ID":"5bfe87b4-aa9f-475a-bba9-438425d79d47","Type":"ContainerDied","Data":"9f01f977fcd48c7d56cf6e75ee3010915c5fbcbb538258a8d44ef2c2a15d0cb0"} Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.243444 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f01f977fcd48c7d56cf6e75ee3010915c5fbcbb538258a8d44ef2c2a15d0cb0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.243498 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-96zr2" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.266540 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-job-config-data\") pod \"5bfe87b4-aa9f-475a-bba9-438425d79d47\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.268056 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-config-data\") pod \"5bfe87b4-aa9f-475a-bba9-438425d79d47\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.270096 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtr88\" (UniqueName: \"kubernetes.io/projected/5bfe87b4-aa9f-475a-bba9-438425d79d47-kube-api-access-mtr88\") pod \"5bfe87b4-aa9f-475a-bba9-438425d79d47\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.270146 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-combined-ca-bundle\") pod \"5bfe87b4-aa9f-475a-bba9-438425d79d47\" (UID: \"5bfe87b4-aa9f-475a-bba9-438425d79d47\") " Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.282832 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfe87b4-aa9f-475a-bba9-438425d79d47-kube-api-access-mtr88" (OuterVolumeSpecName: "kube-api-access-mtr88") pod "5bfe87b4-aa9f-475a-bba9-438425d79d47" (UID: "5bfe87b4-aa9f-475a-bba9-438425d79d47"). InnerVolumeSpecName "kube-api-access-mtr88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.288794 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-config-data" (OuterVolumeSpecName: "config-data") pod "5bfe87b4-aa9f-475a-bba9-438425d79d47" (UID: "5bfe87b4-aa9f-475a-bba9-438425d79d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.296184 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "5bfe87b4-aa9f-475a-bba9-438425d79d47" (UID: "5bfe87b4-aa9f-475a-bba9-438425d79d47"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.320743 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bfe87b4-aa9f-475a-bba9-438425d79d47" (UID: "5bfe87b4-aa9f-475a-bba9-438425d79d47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.376053 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.376102 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtr88\" (UniqueName: \"kubernetes.io/projected/5bfe87b4-aa9f-475a-bba9-438425d79d47-kube-api-access-mtr88\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.376121 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.376137 4672 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5bfe87b4-aa9f-475a-bba9-438425d79d47-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.626707 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:01:35 crc kubenswrapper[4672]: E1206 10:01:35.627805 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="extract-content" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.627826 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="extract-content" Dec 06 10:01:35 crc kubenswrapper[4672]: E1206 10:01:35.627835 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="registry-server" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.627843 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="registry-server" Dec 06 10:01:35 crc kubenswrapper[4672]: E1206 10:01:35.627858 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f321169c-b38c-4403-8541-48064fd878b2" containerName="keystone-cron" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.627865 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f321169c-b38c-4403-8541-48064fd878b2" containerName="keystone-cron" Dec 06 10:01:35 crc kubenswrapper[4672]: E1206 10:01:35.627882 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfe87b4-aa9f-475a-bba9-438425d79d47" containerName="manila-db-sync" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.627888 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfe87b4-aa9f-475a-bba9-438425d79d47" containerName="manila-db-sync" Dec 06 10:01:35 crc kubenswrapper[4672]: E1206 10:01:35.627897 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="extract-utilities" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.627903 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="extract-utilities" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.628092 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f321169c-b38c-4403-8541-48064fd878b2" containerName="keystone-cron" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.628107 4672 
memory_manager.go:354] "RemoveStaleState removing state" podUID="de298690-6d2a-44d6-9744-0a29c7041aa4" containerName="registry-server" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.628122 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfe87b4-aa9f-475a-bba9-438425d79d47" containerName="manila-db-sync" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.629104 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.633999 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.634076 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.634205 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-rv9j5" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.638185 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.644634 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.681331 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb92r\" (UniqueName: \"kubernetes.io/projected/08451f8e-b555-445f-b31d-e0e9f8011ed0-kube-api-access-bb92r\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.681626 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-scripts\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.681651 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.681763 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.681806 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.681836 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08451f8e-b555-445f-b31d-e0e9f8011ed0-etc-machine-id\") pod 
\"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.710946 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.712443 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.731306 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.733701 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784736 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-ceph\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784795 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784837 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784869 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08451f8e-b555-445f-b31d-e0e9f8011ed0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784896 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb92r\" (UniqueName: \"kubernetes.io/projected/08451f8e-b555-445f-b31d-e0e9f8011ed0-kube-api-access-bb92r\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784939 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784963 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxd7q\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-kube-api-access-xxd7q\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.784996 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-scripts\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.785017 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.785054 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.785076 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-scripts\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.785109 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.785127 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.785145 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.790107 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08451f8e-b555-445f-b31d-e0e9f8011ed0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.791550 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.796632 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data-custom\") 
pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.812975 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-scripts\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.814204 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.845331 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb92r\" (UniqueName: \"kubernetes.io/projected/08451f8e-b555-445f-b31d-e0e9f8011ed0-kube-api-access-bb92r\") pod \"manila-scheduler-0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888035 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888102 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxd7q\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-kube-api-access-xxd7q\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888211 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-scripts\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888263 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888319 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888343 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888370 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-ceph\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888890 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.888947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.895294 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-ceph\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.895384 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.895840 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-scripts\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.895889 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.902903 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.919389 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxd7q\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-kube-api-access-xxd7q\") pod \"manila-share-share1-0\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " pod="openstack/manila-share-share1-0" 
Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.923772 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b586f587-9rz4l"] Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.925917 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.950400 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.957241 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b586f587-9rz4l"] Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.990113 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-dns-svc\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.990174 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.990217 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6w7q\" (UniqueName: \"kubernetes.io/projected/961904ba-a936-4912-b5a1-a20e4a4028e6-kube-api-access-m6w7q\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.990259 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.990320 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-config\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:35 crc kubenswrapper[4672]: I1206 10:01:35.990387 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.032218 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.034033 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.039929 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.042388 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.047900 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092621 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25e89f55-af8e-4fda-a70e-fee52e57568f-etc-machine-id\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092674 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092755 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-scripts\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092823 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e89f55-af8e-4fda-a70e-fee52e57568f-logs\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092899 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-dns-svc\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092950 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.092983 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data-custom\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.093035 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6w7q\" (UniqueName: \"kubernetes.io/projected/961904ba-a936-4912-b5a1-a20e4a4028e6-kube-api-access-m6w7q\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " 
pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.093082 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.093149 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srz4k\" (UniqueName: \"kubernetes.io/projected/25e89f55-af8e-4fda-a70e-fee52e57568f-kube-api-access-srz4k\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.093210 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-config\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.093231 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.093254 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.094752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.095529 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-dns-svc\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.096851 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.099279 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.100498 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961904ba-a936-4912-b5a1-a20e4a4028e6-config\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.152656 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6w7q\" (UniqueName: \"kubernetes.io/projected/961904ba-a936-4912-b5a1-a20e4a4028e6-kube-api-access-m6w7q\") pod \"dnsmasq-dns-6b586f587-9rz4l\" (UID: \"961904ba-a936-4912-b5a1-a20e4a4028e6\") " pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.194984 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25e89f55-af8e-4fda-a70e-fee52e57568f-etc-machine-id\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.195315 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-scripts\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.195375 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e89f55-af8e-4fda-a70e-fee52e57568f-logs\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.195473 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25e89f55-af8e-4fda-a70e-fee52e57568f-etc-machine-id\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.195880 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data-custom\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.196057 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srz4k\" (UniqueName: \"kubernetes.io/projected/25e89f55-af8e-4fda-a70e-fee52e57568f-kube-api-access-srz4k\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.196378 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.196678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.196823 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e89f55-af8e-4fda-a70e-fee52e57568f-logs\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.213296 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.216654 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data-custom\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.217969 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-scripts\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.225382 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.229053 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srz4k\" (UniqueName: \"kubernetes.io/projected/25e89f55-af8e-4fda-a70e-fee52e57568f-kube-api-access-srz4k\") pod \"manila-api-0\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.281961 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.322639 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 06 10:01:36 crc kubenswrapper[4672]: I1206 10:01:36.745933 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:01:37 crc kubenswrapper[4672]: I1206 10:01:37.075691 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b586f587-9rz4l"] Dec 06 10:01:37 crc kubenswrapper[4672]: I1206 10:01:37.296071 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:01:37 crc kubenswrapper[4672]: I1206 10:01:37.324706 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"08451f8e-b555-445f-b31d-e0e9f8011ed0","Type":"ContainerStarted","Data":"816a4578a08e84763807de9932a88cdbabe6d5bd69485183e00a84f783eeb5eb"} Dec 06 10:01:37 crc kubenswrapper[4672]: I1206 10:01:37.337388 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c109eeef-0858-4722-a7d6-cefd2b8979b4","Type":"ContainerStarted","Data":"bf2b03df96b3e6d5a9fe1850d264e32ce28ddce1fdd8f2356e4a1531268b03bb"} Dec 06 10:01:37 crc kubenswrapper[4672]: I1206 10:01:37.340472 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" event={"ID":"961904ba-a936-4912-b5a1-a20e4a4028e6","Type":"ContainerStarted","Data":"caa8937fdd0e51bb5ae7ccaa2e679c6fe5bd7460eacd6c0d8ffc28505b5c52a3"} Dec 06 10:01:37 crc kubenswrapper[4672]: I1206 10:01:37.392237 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 06 10:01:38 crc kubenswrapper[4672]: I1206 10:01:38.354577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"25e89f55-af8e-4fda-a70e-fee52e57568f","Type":"ContainerStarted","Data":"b7f300c09b7eb59dc1cc82054026f604f5b969dd3703d6a5b9c8f0a97fd92457"} Dec 06 10:01:38 crc kubenswrapper[4672]: I1206 10:01:38.368352 4672 generic.go:334] "Generic (PLEG): container finished" podID="961904ba-a936-4912-b5a1-a20e4a4028e6" containerID="0a39aa0739729eab40e55c1ebac7e6730e1d235d2bd7a0980f260d15b3e84559" exitCode=0 Dec 06 10:01:38 crc kubenswrapper[4672]: I1206 10:01:38.368391 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" event={"ID":"961904ba-a936-4912-b5a1-a20e4a4028e6","Type":"ContainerDied","Data":"0a39aa0739729eab40e55c1ebac7e6730e1d235d2bd7a0980f260d15b3e84559"} Dec 06 10:01:38 crc kubenswrapper[4672]: I1206 10:01:38.902977 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.153719 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c74dbc66-8ghhf" podUID="70d8ba3e-3f2d-4627-afab-5bb8908f89eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.241:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.241:8443: connect: connection refused" Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.388796 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" 
event={"ID":"961904ba-a936-4912-b5a1-a20e4a4028e6","Type":"ContainerStarted","Data":"8e79ac5e0dfeae66a7960abae002e4591fe5eb8cdad2accb7f5be75c9f7b4c5f"} Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.388965 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.398021 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"08451f8e-b555-445f-b31d-e0e9f8011ed0","Type":"ContainerStarted","Data":"828d9b901a2aa890f76a1742d2eb6df403db387fe3aa8e59c4d7a4b4c22189ac"} Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.398058 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"08451f8e-b555-445f-b31d-e0e9f8011ed0","Type":"ContainerStarted","Data":"6a96ece6ecfa8e3c40d934c26b0b186331960aae43ac5c94667325666c5c7e8d"} Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.407938 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"25e89f55-af8e-4fda-a70e-fee52e57568f","Type":"ContainerStarted","Data":"3b28f23ad9e2c8a24d91074f7436290ef3650df1372eb9b83046034b8698b41d"} Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.407981 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"25e89f55-af8e-4fda-a70e-fee52e57568f","Type":"ContainerStarted","Data":"cd54ddc56c30affd5ac1887dc1e5338b5c22b0b4612e2c1d10dec357d228833e"} Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.408079 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.421300 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b586f587-9rz4l" podStartSLOduration=4.421281694 podStartE2EDuration="4.421281694s" podCreationTimestamp="2025-12-06 10:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:39.420521934 +0000 UTC m=+3317.164782221" watchObservedRunningTime="2025-12-06 10:01:39.421281694 +0000 UTC m=+3317.165541981" Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.469265 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.469243634 podStartE2EDuration="4.469243634s" podCreationTimestamp="2025-12-06 10:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:39.445790679 +0000 UTC m=+3317.190050966" watchObservedRunningTime="2025-12-06 10:01:39.469243634 +0000 UTC m=+3317.213503921" Dec 06 10:01:39 crc kubenswrapper[4672]: I1206 10:01:39.481547 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.744426537 podStartE2EDuration="4.481528078s" podCreationTimestamp="2025-12-06 10:01:35 +0000 UTC" firstStartedPulling="2025-12-06 10:01:36.772240648 +0000 UTC m=+3314.516500935" lastFinishedPulling="2025-12-06 10:01:37.509342189 +0000 UTC m=+3315.253602476" observedRunningTime="2025-12-06 10:01:39.467614241 +0000 UTC m=+3317.211874538" watchObservedRunningTime="2025-12-06 10:01:39.481528078 +0000 UTC m=+3317.225788355" Dec 06 10:01:40 crc kubenswrapper[4672]: I1206 10:01:40.208219 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-api-0"] Dec 06 10:01:41 crc kubenswrapper[4672]: I1206 10:01:41.430575 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api-log" containerID="cri-o://cd54ddc56c30affd5ac1887dc1e5338b5c22b0b4612e2c1d10dec357d228833e" gracePeriod=30 Dec 06 10:01:41 crc kubenswrapper[4672]: I1206 10:01:41.431692 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api" containerID="cri-o://3b28f23ad9e2c8a24d91074f7436290ef3650df1372eb9b83046034b8698b41d" gracePeriod=30 Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.460865 4672 generic.go:334] "Generic (PLEG): container finished" podID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerID="3b28f23ad9e2c8a24d91074f7436290ef3650df1372eb9b83046034b8698b41d" exitCode=0 Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.461277 4672 generic.go:334] "Generic (PLEG): container finished" podID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerID="cd54ddc56c30affd5ac1887dc1e5338b5c22b0b4612e2c1d10dec357d228833e" exitCode=143 Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.461357 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"25e89f55-af8e-4fda-a70e-fee52e57568f","Type":"ContainerDied","Data":"3b28f23ad9e2c8a24d91074f7436290ef3650df1372eb9b83046034b8698b41d"} Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.461385 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"25e89f55-af8e-4fda-a70e-fee52e57568f","Type":"ContainerDied","Data":"cd54ddc56c30affd5ac1887dc1e5338b5c22b0b4612e2c1d10dec357d228833e"} Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.461394 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"25e89f55-af8e-4fda-a70e-fee52e57568f","Type":"ContainerDied","Data":"b7f300c09b7eb59dc1cc82054026f604f5b969dd3703d6a5b9c8f0a97fd92457"} Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.461403 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7f300c09b7eb59dc1cc82054026f604f5b969dd3703d6a5b9c8f0a97fd92457" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.475645 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651172 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data-custom\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651248 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25e89f55-af8e-4fda-a70e-fee52e57568f-etc-machine-id\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651306 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651337 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e89f55-af8e-4fda-a70e-fee52e57568f-logs\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651350 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25e89f55-af8e-4fda-a70e-fee52e57568f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651374 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-scripts\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651391 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srz4k\" (UniqueName: \"kubernetes.io/projected/25e89f55-af8e-4fda-a70e-fee52e57568f-kube-api-access-srz4k\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651419 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-combined-ca-bundle\") pod \"25e89f55-af8e-4fda-a70e-fee52e57568f\" (UID: \"25e89f55-af8e-4fda-a70e-fee52e57568f\") " Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651685 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e89f55-af8e-4fda-a70e-fee52e57568f-logs" (OuterVolumeSpecName: "logs") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651898 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25e89f55-af8e-4fda-a70e-fee52e57568f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.651916 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e89f55-af8e-4fda-a70e-fee52e57568f-logs\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.676766 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e89f55-af8e-4fda-a70e-fee52e57568f-kube-api-access-srz4k" (OuterVolumeSpecName: "kube-api-access-srz4k") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "kube-api-access-srz4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.681736 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-scripts" (OuterVolumeSpecName: "scripts") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.695628 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.715753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.754232 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.758274 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.758296 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.758305 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srz4k\" (UniqueName: \"kubernetes.io/projected/25e89f55-af8e-4fda-a70e-fee52e57568f-kube-api-access-srz4k\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.778711 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data" (OuterVolumeSpecName: "config-data") pod "25e89f55-af8e-4fda-a70e-fee52e57568f" (UID: "25e89f55-af8e-4fda-a70e-fee52e57568f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:42 crc kubenswrapper[4672]: I1206 10:01:42.859783 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e89f55-af8e-4fda-a70e-fee52e57568f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.472139 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.527682 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.542382 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.564389 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Dec 06 10:01:43 crc kubenswrapper[4672]: E1206 10:01:43.564804 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.564821 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api"
Dec 06 10:01:43 crc kubenswrapper[4672]: E1206 10:01:43.564852 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api-log"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.564859 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api-log"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.565072 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.565101 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" containerName="manila-api-log"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.566091 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.569280 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.570384 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.571714 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.581926 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.675531 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-scripts\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.675849 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrtjc\" (UniqueName: \"kubernetes.io/projected/42ee5091-4d7a-4807-905e-19dddd238386-kube-api-access-mrtjc\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.676042 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-config-data-custom\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.676463 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42ee5091-4d7a-4807-905e-19dddd238386-etc-machine-id\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.676574 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.676752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-public-tls-certs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.676843 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-internal-tls-certs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.676975 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-config-data\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.677140 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42ee5091-4d7a-4807-905e-19dddd238386-logs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.778921 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42ee5091-4d7a-4807-905e-19dddd238386-etc-machine-id\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.778970 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779005 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-public-tls-certs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779023 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-internal-tls-certs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779018 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42ee5091-4d7a-4807-905e-19dddd238386-etc-machine-id\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779066 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-config-data\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779109 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42ee5091-4d7a-4807-905e-19dddd238386-logs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-scripts\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779225 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrtjc\" (UniqueName: \"kubernetes.io/projected/42ee5091-4d7a-4807-905e-19dddd238386-kube-api-access-mrtjc\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779312 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-config-data-custom\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.779574 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42ee5091-4d7a-4807-905e-19dddd238386-logs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.784373 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-internal-tls-certs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.792093 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-public-tls-certs\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.796963 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-config-data\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.798979 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-scripts\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.802002 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.805140 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrtjc\" (UniqueName: \"kubernetes.io/projected/42ee5091-4d7a-4807-905e-19dddd238386-kube-api-access-mrtjc\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.831205 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ee5091-4d7a-4807-905e-19dddd238386-config-data-custom\") pod \"manila-api-0\" (UID: \"42ee5091-4d7a-4807-905e-19dddd238386\") " pod="openstack/manila-api-0"
Dec 06 10:01:43 crc kubenswrapper[4672]: I1206 10:01:43.914324 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 06 10:01:44 crc kubenswrapper[4672]: I1206 10:01:44.621014 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e89f55-af8e-4fda-a70e-fee52e57568f" path="/var/lib/kubelet/pods/25e89f55-af8e-4fda-a70e-fee52e57568f/volumes"
Dec 06 10:01:44 crc kubenswrapper[4672]: I1206 10:01:44.673498 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 06 10:01:45 crc kubenswrapper[4672]: I1206 10:01:45.951943 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Dec 06 10:01:46 crc kubenswrapper[4672]: I1206 10:01:46.283952 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b586f587-9rz4l"
Dec 06 10:01:46 crc kubenswrapper[4672]: I1206 10:01:46.471138 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f644dcf9-td5cr"]
Dec 06 10:01:46 crc kubenswrapper[4672]: I1206 10:01:46.471404 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="dnsmasq-dns" containerID="cri-o://085952fa0187515779031389f1e54ce2ad21ae44257525de5dc2037206b4aff4" gracePeriod=10
Dec 06 10:01:47 crc kubenswrapper[4672]: I1206 10:01:47.526211 4672 generic.go:334] "Generic (PLEG): container finished" podID="3021c245-1d0d-4727-906b-26784c5e10bc" containerID="085952fa0187515779031389f1e54ce2ad21ae44257525de5dc2037206b4aff4" exitCode=0
Dec 06 10:01:47 crc kubenswrapper[4672]: I1206 10:01:47.526290 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" event={"ID":"3021c245-1d0d-4727-906b-26784c5e10bc","Type":"ContainerDied","Data":"085952fa0187515779031389f1e54ce2ad21ae44257525de5dc2037206b4aff4"}
Dec 06 10:01:49 crc kubenswrapper[4672]: I1206 10:01:49.528852 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.191:5353: connect: connection refused"
Dec 06 10:01:49 crc kubenswrapper[4672]: W1206 10:01:49.937325 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42ee5091_4d7a_4807_905e_19dddd238386.slice/crio-dad140d48059db518353df0beea61e4c14d192ad548d4e715f4f42be826ae241 WatchSource:0}: Error finding container dad140d48059db518353df0beea61e4c14d192ad548d4e715f4f42be826ae241: Status 404 returned error can't find the container with id dad140d48059db518353df0beea61e4c14d192ad548d4e715f4f42be826ae241
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.303046 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.487156 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.487451 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-central-agent" containerID="cri-o://e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df" gracePeriod=30
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.487583 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="proxy-httpd" containerID="cri-o://fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf" gracePeriod=30
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.487666 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="sg-core" containerID="cri-o://50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b" gracePeriod=30
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.487699 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-notification-agent" containerID="cri-o://fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e" gracePeriod=30
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.491899 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-openstack-edpm-ipam\") pod \"3021c245-1d0d-4727-906b-26784c5e10bc\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.492056 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-sb\") pod \"3021c245-1d0d-4727-906b-26784c5e10bc\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.492110 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2cln\" (UniqueName: \"kubernetes.io/projected/3021c245-1d0d-4727-906b-26784c5e10bc-kube-api-access-w2cln\") pod \"3021c245-1d0d-4727-906b-26784c5e10bc\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.492176 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-nb\") pod \"3021c245-1d0d-4727-906b-26784c5e10bc\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.492198 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-dns-svc\") pod \"3021c245-1d0d-4727-906b-26784c5e10bc\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.492220 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-config\") pod \"3021c245-1d0d-4727-906b-26784c5e10bc\" (UID: \"3021c245-1d0d-4727-906b-26784c5e10bc\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.512294 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3021c245-1d0d-4727-906b-26784c5e10bc-kube-api-access-w2cln" (OuterVolumeSpecName: "kube-api-access-w2cln") pod "3021c245-1d0d-4727-906b-26784c5e10bc" (UID: "3021c245-1d0d-4727-906b-26784c5e10bc"). InnerVolumeSpecName "kube-api-access-w2cln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.597425 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2cln\" (UniqueName: \"kubernetes.io/projected/3021c245-1d0d-4727-906b-26784c5e10bc-kube-api-access-w2cln\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.603746 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3021c245-1d0d-4727-906b-26784c5e10bc" (UID: "3021c245-1d0d-4727-906b-26784c5e10bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.629490 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"42ee5091-4d7a-4807-905e-19dddd238386","Type":"ContainerStarted","Data":"dad140d48059db518353df0beea61e4c14d192ad548d4e715f4f42be826ae241"}
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.639051 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr" event={"ID":"3021c245-1d0d-4727-906b-26784c5e10bc","Type":"ContainerDied","Data":"209674e34a76b9fbeb4ab75a065667e3e443fe16557599425edf8a27bca87755"}
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.639108 4672 scope.go:117] "RemoveContainer" containerID="085952fa0187515779031389f1e54ce2ad21ae44257525de5dc2037206b4aff4"
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.639258 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f644dcf9-td5cr"
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.687817 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3021c245-1d0d-4727-906b-26784c5e10bc" (UID: "3021c245-1d0d-4727-906b-26784c5e10bc"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.692791 4672 generic.go:334] "Generic (PLEG): container finished" podID="d26c88b2-357e-4c41-8f93-1f6422329000" containerID="a8b766d4589057d37abd957828d40407b2391ad1169cfd32ac838699e89eec28" exitCode=137
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.692826 4672 generic.go:334] "Generic (PLEG): container finished" podID="d26c88b2-357e-4c41-8f93-1f6422329000" containerID="c0849faff0fa8114dafc6ad4ab8ba1a0e601ad343bcb00cd253d76d3d3ec7cdf" exitCode=137
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.692851 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c65c599b9-mz2lg" event={"ID":"d26c88b2-357e-4c41-8f93-1f6422329000","Type":"ContainerDied","Data":"a8b766d4589057d37abd957828d40407b2391ad1169cfd32ac838699e89eec28"}
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.692887 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c65c599b9-mz2lg" event={"ID":"d26c88b2-357e-4c41-8f93-1f6422329000","Type":"ContainerDied","Data":"c0849faff0fa8114dafc6ad4ab8ba1a0e601ad343bcb00cd253d76d3d3ec7cdf"}
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.699451 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3021c245-1d0d-4727-906b-26784c5e10bc" (UID: "3021c245-1d0d-4727-906b-26784c5e10bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.700077 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.700100 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.700152 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-config" (OuterVolumeSpecName: "config") pod "3021c245-1d0d-4727-906b-26784c5e10bc" (UID: "3021c245-1d0d-4727-906b-26784c5e10bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.700894 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.710735 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3021c245-1d0d-4727-906b-26784c5e10bc" (UID: "3021c245-1d0d-4727-906b-26784c5e10bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.785127 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c65c599b9-mz2lg"
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.802963 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.803002 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3021c245-1d0d-4727-906b-26784c5e10bc-config\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.803874 4672 scope.go:117] "RemoveContainer" containerID="aac2eb8cb92e521fb47aef0baa2e09a0281592ee44653b6364f4856fe669fa0c"
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.904925 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-scripts\") pod \"d26c88b2-357e-4c41-8f93-1f6422329000\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.905081 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-config-data\") pod \"d26c88b2-357e-4c41-8f93-1f6422329000\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.905197 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26c88b2-357e-4c41-8f93-1f6422329000-logs\") pod \"d26c88b2-357e-4c41-8f93-1f6422329000\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.905261 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d26c88b2-357e-4c41-8f93-1f6422329000-horizon-secret-key\") pod \"d26c88b2-357e-4c41-8f93-1f6422329000\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.905282 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrckh\" (UniqueName: \"kubernetes.io/projected/d26c88b2-357e-4c41-8f93-1f6422329000-kube-api-access-vrckh\") pod \"d26c88b2-357e-4c41-8f93-1f6422329000\" (UID: \"d26c88b2-357e-4c41-8f93-1f6422329000\") "
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.907046 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26c88b2-357e-4c41-8f93-1f6422329000-logs" (OuterVolumeSpecName: "logs") pod "d26c88b2-357e-4c41-8f93-1f6422329000" (UID: "d26c88b2-357e-4c41-8f93-1f6422329000"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.918182 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26c88b2-357e-4c41-8f93-1f6422329000-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d26c88b2-357e-4c41-8f93-1f6422329000" (UID: "d26c88b2-357e-4c41-8f93-1f6422329000"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.921959 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26c88b2-357e-4c41-8f93-1f6422329000-kube-api-access-vrckh" (OuterVolumeSpecName: "kube-api-access-vrckh") pod "d26c88b2-357e-4c41-8f93-1f6422329000" (UID: "d26c88b2-357e-4c41-8f93-1f6422329000"). InnerVolumeSpecName "kube-api-access-vrckh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.954453 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-config-data" (OuterVolumeSpecName: "config-data") pod "d26c88b2-357e-4c41-8f93-1f6422329000" (UID: "d26c88b2-357e-4c41-8f93-1f6422329000"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:50 crc kubenswrapper[4672]: I1206 10:01:50.975746 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-scripts" (OuterVolumeSpecName: "scripts") pod "d26c88b2-357e-4c41-8f93-1f6422329000" (UID: "d26c88b2-357e-4c41-8f93-1f6422329000"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.008348 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d26c88b2-357e-4c41-8f93-1f6422329000-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.008386 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrckh\" (UniqueName: \"kubernetes.io/projected/d26c88b2-357e-4c41-8f93-1f6422329000-kube-api-access-vrckh\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.008401 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.008411 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d26c88b2-357e-4c41-8f93-1f6422329000-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.008423 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26c88b2-357e-4c41-8f93-1f6422329000-logs\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.158962 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f644dcf9-td5cr"]
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.167073 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f644dcf9-td5cr"]
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.736092 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c109eeef-0858-4722-a7d6-cefd2b8979b4","Type":"ContainerStarted","Data":"8b2d0be221687a567f51ed7fb58337300f6ca2d25de24d0c8dea815b511d0025"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.764891 4672 generic.go:334] "Generic (PLEG): container finished" podID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerID="fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf" exitCode=0
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.764921 4672 generic.go:334] "Generic (PLEG): container finished" podID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerID="50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b" exitCode=2
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.764928 4672 generic.go:334] "Generic (PLEG): container finished" podID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerID="e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df" exitCode=0
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.764971 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerDied","Data":"fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.764997 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerDied","Data":"50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.765008 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerDied","Data":"e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.776764 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b071d77-1fdb-4938-95ee-10e91492c545" containerID="b2add5d8255b5d6dc54de6dd5ffbb569f6e68989c4a04cc15b366791492dc2b6" exitCode=137
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.776797 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b071d77-1fdb-4938-95ee-10e91492c545" containerID="56b6135ba52ae6cdc807e3227322d4780647411d55647b5b69917ee780a7526a" exitCode=137
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.776841 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76f96bf4d5-7wt8s" event={"ID":"9b071d77-1fdb-4938-95ee-10e91492c545","Type":"ContainerDied","Data":"b2add5d8255b5d6dc54de6dd5ffbb569f6e68989c4a04cc15b366791492dc2b6"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.776864 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76f96bf4d5-7wt8s" event={"ID":"9b071d77-1fdb-4938-95ee-10e91492c545","Type":"ContainerDied","Data":"56b6135ba52ae6cdc807e3227322d4780647411d55647b5b69917ee780a7526a"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.810858 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c65c599b9-mz2lg" event={"ID":"d26c88b2-357e-4c41-8f93-1f6422329000","Type":"ContainerDied","Data":"3337070ccd27e6a4582d339749efb97cb0bccc215ff1d359f46e806dfee91191"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.810906 4672 scope.go:117] "RemoveContainer" containerID="a8b766d4589057d37abd957828d40407b2391ad1169cfd32ac838699e89eec28"
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.811016 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c65c599b9-mz2lg"
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.819712 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"42ee5091-4d7a-4807-905e-19dddd238386","Type":"ContainerStarted","Data":"6591e071331809cce901b29fd0e052a12b5a99422565eb49589276e70bc90446"}
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.869767 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c65c599b9-mz2lg"]
Dec 06 10:01:51 crc kubenswrapper[4672]: I1206 10:01:51.881359 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c65c599b9-mz2lg"]
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.023044 4672 scope.go:117] "RemoveContainer" containerID="c0849faff0fa8114dafc6ad4ab8ba1a0e601ad343bcb00cd253d76d3d3ec7cdf"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.125490 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76f96bf4d5-7wt8s"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.244287 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-scripts\") pod \"9b071d77-1fdb-4938-95ee-10e91492c545\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.244416 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrzvq\" (UniqueName: \"kubernetes.io/projected/9b071d77-1fdb-4938-95ee-10e91492c545-kube-api-access-lrzvq\") pod \"9b071d77-1fdb-4938-95ee-10e91492c545\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.244441 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9b071d77-1fdb-4938-95ee-10e91492c545-horizon-secret-key\") pod \"9b071d77-1fdb-4938-95ee-10e91492c545\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.244475 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b071d77-1fdb-4938-95ee-10e91492c545-logs\") pod \"9b071d77-1fdb-4938-95ee-10e91492c545\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.244511 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-config-data\") pod \"9b071d77-1fdb-4938-95ee-10e91492c545\" (UID: \"9b071d77-1fdb-4938-95ee-10e91492c545\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.250155 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b071d77-1fdb-4938-95ee-10e91492c545-logs" (OuterVolumeSpecName: "logs") pod "9b071d77-1fdb-4938-95ee-10e91492c545" (UID: "9b071d77-1fdb-4938-95ee-10e91492c545"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.261783 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b071d77-1fdb-4938-95ee-10e91492c545-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9b071d77-1fdb-4938-95ee-10e91492c545" (UID: "9b071d77-1fdb-4938-95ee-10e91492c545"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.267867 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b071d77-1fdb-4938-95ee-10e91492c545-kube-api-access-lrzvq" (OuterVolumeSpecName: "kube-api-access-lrzvq") pod "9b071d77-1fdb-4938-95ee-10e91492c545" (UID: "9b071d77-1fdb-4938-95ee-10e91492c545"). InnerVolumeSpecName "kube-api-access-lrzvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.309106 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-scripts" (OuterVolumeSpecName: "scripts") pod "9b071d77-1fdb-4938-95ee-10e91492c545" (UID: "9b071d77-1fdb-4938-95ee-10e91492c545"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.309927 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-config-data" (OuterVolumeSpecName: "config-data") pod "9b071d77-1fdb-4938-95ee-10e91492c545" (UID: "9b071d77-1fdb-4938-95ee-10e91492c545"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.349724 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.349755 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrzvq\" (UniqueName: \"kubernetes.io/projected/9b071d77-1fdb-4938-95ee-10e91492c545-kube-api-access-lrzvq\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.349785 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9b071d77-1fdb-4938-95ee-10e91492c545-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.349794 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b071d77-1fdb-4938-95ee-10e91492c545-logs\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.349803 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b071d77-1fdb-4938-95ee-10e91492c545-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.576918 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" path="/var/lib/kubelet/pods/3021c245-1d0d-4727-906b-26784c5e10bc/volumes"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.577954 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" path="/var/lib/kubelet/pods/d26c88b2-357e-4c41-8f93-1f6422329000/volumes"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.757348 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.831428 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c109eeef-0858-4722-a7d6-cefd2b8979b4","Type":"ContainerStarted","Data":"ed3a4957a69cdc5f5bfb86d0339c6fe0333d784786dc059c85b7f8c3c13e1209"}
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.851712 4672 generic.go:334] "Generic (PLEG): container finished" podID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerID="fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e" exitCode=0
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.852012 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerDied","Data":"fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e"}
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.852040 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ab86bc2-089c-46d4-9c2c-a05140110779","Type":"ContainerDied","Data":"a75c7ebcf73ef883ff1c97ca59b881d6695c71f9987d9938ca5feab3ba378c5b"}
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.852056 4672 scope.go:117] "RemoveContainer" containerID="fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.852190 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.856222 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76f96bf4d5-7wt8s" event={"ID":"9b071d77-1fdb-4938-95ee-10e91492c545","Type":"ContainerDied","Data":"d8384d411952f3eb9bb4d82aded129e5eab4d0e33ccc750bb50ebed0bda1bd96"}
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.856283 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76f96bf4d5-7wt8s"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.871208 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.064164701 podStartE2EDuration="17.871191564s" podCreationTimestamp="2025-12-06 10:01:35 +0000 UTC" firstStartedPulling="2025-12-06 10:01:37.249207907 +0000 UTC m=+3314.993468194" lastFinishedPulling="2025-12-06 10:01:50.05623477 +0000 UTC m=+3327.800495057" observedRunningTime="2025-12-06 10:01:52.860562036 +0000 UTC m=+3330.604822323" watchObservedRunningTime="2025-12-06 10:01:52.871191564 +0000 UTC m=+3330.615451851"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.875933 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"42ee5091-4d7a-4807-905e-19dddd238386","Type":"ContainerStarted","Data":"c3cbabf86f33f8e05ce24f2ff17148544fbc73e7e26555b7826622f14d665127"}
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.876792 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.914507 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76f96bf4d5-7wt8s"]
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.919201 4672 scope.go:117] "RemoveContainer" containerID="50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.923872 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76f96bf4d5-7wt8s"]
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.929546 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=9.929526856 podStartE2EDuration="9.929526856s" podCreationTimestamp="2025-12-06 10:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:52.909985236 +0000 UTC m=+3330.654245543" watchObservedRunningTime="2025-12-06 10:01:52.929526856 +0000 UTC m=+3330.673787143"
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961304 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-log-httpd\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961369 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-config-data\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961435 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-run-httpd\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961634 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-ceilometer-tls-certs\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961711 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-sg-core-conf-yaml\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961760 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkpd8\" (UniqueName: \"kubernetes.io/projected/4ab86bc2-089c-46d4-9c2c-a05140110779-kube-api-access-zkpd8\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961798 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-scripts\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.961859 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-combined-ca-bundle\") pod \"4ab86bc2-089c-46d4-9c2c-a05140110779\" (UID: \"4ab86bc2-089c-46d4-9c2c-a05140110779\") "
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.964243 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.974625 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:01:52 crc kubenswrapper[4672]: I1206 10:01:52.978865 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-scripts" (OuterVolumeSpecName: "scripts") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.010914 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab86bc2-089c-46d4-9c2c-a05140110779-kube-api-access-zkpd8" (OuterVolumeSpecName: "kube-api-access-zkpd8") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "kube-api-access-zkpd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.011101 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8c74dbc66-8ghhf"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.064278 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkpd8\" (UniqueName: \"kubernetes.io/projected/4ab86bc2-089c-46d4-9c2c-a05140110779-kube-api-access-zkpd8\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.064326 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.064335 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.064345 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ab86bc2-089c-46d4-9c2c-a05140110779-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.129753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.139762 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.175001 4672 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.175036 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.185157 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-config-data" (OuterVolumeSpecName: "config-data") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.204384 4672 scope.go:117] "RemoveContainer" containerID="fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.204919 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ab86bc2-089c-46d4-9c2c-a05140110779" (UID: "4ab86bc2-089c-46d4-9c2c-a05140110779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.247470 4672 scope.go:117] "RemoveContainer" containerID="e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.282285 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.282310 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ab86bc2-089c-46d4-9c2c-a05140110779-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.285301 4672 scope.go:117] "RemoveContainer" containerID="fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.286870 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf\": container with ID starting with fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf not found: ID does not exist" containerID="fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.286903 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf"} err="failed to get container status \"fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf\": rpc error: code = NotFound desc = could not find container \"fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf\": container with ID starting with fbb25ae67a88e3e2e2bae85c668babac0e3d42d08abe8b87f58e1bae3e8886cf not found: ID does not exist"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.286925 4672 scope.go:117] "RemoveContainer" containerID="50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.287093 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b\": container with ID starting with 50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b not found: ID does not exist" containerID="50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.287109 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b"} err="failed to get container status \"50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b\": rpc error: code = NotFound desc = could not find container \"50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b\": container with ID starting with 50ef7bdad1a9bf111054fbd7bb4178b2a45b503438a5979010c382d3f38acb1b not found: ID does not exist"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.287122 4672 scope.go:117] "RemoveContainer" containerID="fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.287302 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e\": container with ID starting with fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e not found: ID does not exist" containerID="fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.287322 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e"} err="failed to get container status \"fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e\": rpc error: code = NotFound desc = could not find container \"fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e\": container with ID starting with fe312428c5b7cbb369d1801b2a999534b6fb673812b268823eff70f7676b869e not found: ID does not exist"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.287334 4672 scope.go:117] "RemoveContainer" containerID="e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.287486 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df\": container with ID starting with e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df not found: ID does not exist" containerID="e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.287505 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df"} err="failed to get container status \"e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df\": rpc error: code = NotFound desc = could not find container \"e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df\": container with ID starting with e936120c2580c0bcc7e57e8663991ac5bc0f338b9ba9058f75b969936cd4c4df not found: ID does not exist"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.287516 4672 scope.go:117] "RemoveContainer" containerID="b2add5d8255b5d6dc54de6dd5ffbb569f6e68989c4a04cc15b366791492dc2b6"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.506340 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.506975 4672 scope.go:117] "RemoveContainer" containerID="56b6135ba52ae6cdc807e3227322d4780647411d55647b5b69917ee780a7526a"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.520415 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.531622 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534687 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-notification-agent"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534716 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-notification-agent"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534739 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon-log"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534746 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon-log"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534760 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon-log"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534766 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon-log"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534796 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534801 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534825 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534830 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534843 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="dnsmasq-dns"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534850 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="dnsmasq-dns"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534866 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="init"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534872 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="init"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-central-agent"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534895 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-central-agent"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534909 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="proxy-httpd"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534915 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="proxy-httpd"
Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.534928 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="sg-core"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.534933 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="sg-core"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544139 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-notification-agent"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544177 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544185 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="proxy-httpd"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544197 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3021c245-1d0d-4727-906b-26784c5e10bc" containerName="dnsmasq-dns"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544215 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="ceilometer-central-agent"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544224 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon-log"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544246 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" containerName="sg-core"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544260 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" containerName="horizon-log"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.544271 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26c88b2-357e-4c41-8f93-1f6422329000" containerName="horizon"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.546388 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.546497 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.551779 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.551899 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.552006 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.589353 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b745d8b98-pzrsc"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.689164 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.689229 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.689266 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.689302 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-log-httpd\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.689354 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-run-httpd\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.689697 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g6fj\" (UniqueName: \"kubernetes.io/projected/5a9e79b9-7523-4874-9912-68f9f580c1ba-kube-api-access-9g6fj\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.690525 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-scripts\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0"
Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.690900 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-config-data\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.732744 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 10:01:53 crc kubenswrapper[4672]: E1206 10:01:53.733472 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-9g6fj log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="5a9e79b9-7523-4874-9912-68f9f580c1ba" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792359 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g6fj\" (UniqueName: \"kubernetes.io/projected/5a9e79b9-7523-4874-9912-68f9f580c1ba-kube-api-access-9g6fj\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792423 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-scripts\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792493 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-config-data\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792551 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792585 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792693 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-log-httpd\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.792721 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-run-httpd\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " 
pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.793108 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-log-httpd\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.793197 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-run-httpd\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.797185 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.797373 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.798510 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-config-data\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.799227 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-scripts\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.801201 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.811002 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g6fj\" (UniqueName: \"kubernetes.io/projected/5a9e79b9-7523-4874-9912-68f9f580c1ba-kube-api-access-9g6fj\") pod \"ceilometer-0\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.893637 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.902153 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.996125 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-run-httpd\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.996198 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-ceilometer-tls-certs\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.996221 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-scripts\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.996267 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-sg-core-conf-yaml\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.996406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.997411 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-log-httpd\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.997491 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g6fj\" (UniqueName: \"kubernetes.io/projected/5a9e79b9-7523-4874-9912-68f9f580c1ba-kube-api-access-9g6fj\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.997538 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-combined-ca-bundle\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.997660 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-config-data\") pod \"5a9e79b9-7523-4874-9912-68f9f580c1ba\" (UID: \"5a9e79b9-7523-4874-9912-68f9f580c1ba\") " Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.997777 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.998225 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:53 crc kubenswrapper[4672]: I1206 10:01:53.998247 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a9e79b9-7523-4874-9912-68f9f580c1ba-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.002576 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.002662 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.009645 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.010830 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9e79b9-7523-4874-9912-68f9f580c1ba-kube-api-access-9g6fj" (OuterVolumeSpecName: "kube-api-access-9g6fj") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "kube-api-access-9g6fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.013796 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-config-data" (OuterVolumeSpecName: "config-data") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.032891 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-scripts" (OuterVolumeSpecName: "scripts") pod "5a9e79b9-7523-4874-9912-68f9f580c1ba" (UID: "5a9e79b9-7523-4874-9912-68f9f580c1ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.101196 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.101228 4672 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.101240 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.101250 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.101259 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g6fj\" (UniqueName: \"kubernetes.io/projected/5a9e79b9-7523-4874-9912-68f9f580c1ba-kube-api-access-9g6fj\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.101267 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9e79b9-7523-4874-9912-68f9f580c1ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.566441 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4ab86bc2-089c-46d4-9c2c-a05140110779" path="/var/lib/kubelet/pods/4ab86bc2-089c-46d4-9c2c-a05140110779/volumes" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.567687 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b071d77-1fdb-4938-95ee-10e91492c545" path="/var/lib/kubelet/pods/9b071d77-1fdb-4938-95ee-10e91492c545/volumes" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.900259 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.956636 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.971946 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.987240 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.992432 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.995278 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.995652 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 10:01:54 crc kubenswrapper[4672]: I1206 10:01:54.995795 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.000829 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.127777 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42976197-15a4-4ceb-baf3-fa56682d89a6-log-httpd\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.127826 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g2gc\" (UniqueName: \"kubernetes.io/projected/42976197-15a4-4ceb-baf3-fa56682d89a6-kube-api-access-2g2gc\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.127876 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-config-data\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.127925 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.128019 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.128051 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.128092 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-scripts\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.128108 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42976197-15a4-4ceb-baf3-fa56682d89a6-run-httpd\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.229780 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.229825 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.229862 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-scripts\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.229879 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42976197-15a4-4ceb-baf3-fa56682d89a6-run-httpd\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.229945 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42976197-15a4-4ceb-baf3-fa56682d89a6-log-httpd\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.229973 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g2gc\" (UniqueName: \"kubernetes.io/projected/42976197-15a4-4ceb-baf3-fa56682d89a6-kube-api-access-2g2gc\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.230012 4672 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-config-data\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.230027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.231435 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42976197-15a4-4ceb-baf3-fa56682d89a6-run-httpd\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.231829 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42976197-15a4-4ceb-baf3-fa56682d89a6-log-httpd\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.234338 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.235039 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-config-data\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.248350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-scripts\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.248886 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.249498 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42976197-15a4-4ceb-baf3-fa56682d89a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.258271 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g2gc\" (UniqueName: \"kubernetes.io/projected/42976197-15a4-4ceb-baf3-fa56682d89a6-kube-api-access-2g2gc\") pod \"ceilometer-0\" (UID: \"42976197-15a4-4ceb-baf3-fa56682d89a6\") " pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.322187 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.394935 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8c74dbc66-8ghhf" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.488474 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b745d8b98-pzrsc"] Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.490712 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon-log" containerID="cri-o://5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641" gracePeriod=30 Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.492746 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" containerID="cri-o://904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d" gracePeriod=30 Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.506018 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.515739 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:47930->10.217.0.240:8443: read: connection reset by peer" Dec 06 10:01:55 crc kubenswrapper[4672]: I1206 10:01:55.964917 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 10:01:56 crc kubenswrapper[4672]: I1206 10:01:56.044180 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 06 10:01:56 crc kubenswrapper[4672]: I1206 10:01:56.577625 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9e79b9-7523-4874-9912-68f9f580c1ba" path="/var/lib/kubelet/pods/5a9e79b9-7523-4874-9912-68f9f580c1ba/volumes" Dec 06 10:01:56 crc kubenswrapper[4672]: I1206 10:01:56.917782 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42976197-15a4-4ceb-baf3-fa56682d89a6","Type":"ContainerStarted","Data":"32d468201ac8e131489233ef5e52a82707d975412174ccfe34afd75af551be69"} Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.119700 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.178847 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.908420 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:47946->10.217.0.240:8443: read: connection reset by peer" Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.937182 4672 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42976197-15a4-4ceb-baf3-fa56682d89a6","Type":"ContainerStarted","Data":"fa248802cd26feb266705190d2139a13ad616931956e1901444e42f184ebe661"} Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.937224 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42976197-15a4-4ceb-baf3-fa56682d89a6","Type":"ContainerStarted","Data":"7138d22d362de1875574ae0cf350db5c8b74d78bd3295bdf6ef53b98eed56f8d"} Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.937296 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="manila-scheduler" containerID="cri-o://6a96ece6ecfa8e3c40d934c26b0b186331960aae43ac5c94667325666c5c7e8d" gracePeriod=30 Dec 06 10:01:58 crc kubenswrapper[4672]: I1206 10:01:58.937690 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="probe" containerID="cri-o://828d9b901a2aa890f76a1742d2eb6df403db387fe3aa8e59c4d7a4b4c22189ac" gracePeriod=30 Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.950688 4672 generic.go:334] "Generic (PLEG): container finished" podID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerID="828d9b901a2aa890f76a1742d2eb6df403db387fe3aa8e59c4d7a4b4c22189ac" exitCode=0 Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.950909 4672 generic.go:334] "Generic (PLEG): container finished" podID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerID="6a96ece6ecfa8e3c40d934c26b0b186331960aae43ac5c94667325666c5c7e8d" exitCode=0 Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.950861 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"08451f8e-b555-445f-b31d-e0e9f8011ed0","Type":"ContainerDied","Data":"828d9b901a2aa890f76a1742d2eb6df403db387fe3aa8e59c4d7a4b4c22189ac"} Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.950970 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"08451f8e-b555-445f-b31d-e0e9f8011ed0","Type":"ContainerDied","Data":"6a96ece6ecfa8e3c40d934c26b0b186331960aae43ac5c94667325666c5c7e8d"} Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.955178 4672 generic.go:334] "Generic (PLEG): container finished" podID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerID="904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d" exitCode=0 Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.955237 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b745d8b98-pzrsc" event={"ID":"afd13bd1-0e47-4739-9f82-e673232e3c61","Type":"ContainerDied","Data":"904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d"} Dec 06 10:01:59 crc kubenswrapper[4672]: I1206 10:01:59.957492 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42976197-15a4-4ceb-baf3-fa56682d89a6","Type":"ContainerStarted","Data":"857dba92b6937bd07d6a99347b5dcf26446f323587c9f91fe85112a4d940f2aa"} Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.214695 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329232 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb92r\" (UniqueName: \"kubernetes.io/projected/08451f8e-b555-445f-b31d-e0e9f8011ed0-kube-api-access-bb92r\") pod \"08451f8e-b555-445f-b31d-e0e9f8011ed0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329278 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08451f8e-b555-445f-b31d-e0e9f8011ed0-etc-machine-id\") pod \"08451f8e-b555-445f-b31d-e0e9f8011ed0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329343 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-scripts\") pod \"08451f8e-b555-445f-b31d-e0e9f8011ed0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329455 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data\") pod \"08451f8e-b555-445f-b31d-e0e9f8011ed0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329470 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08451f8e-b555-445f-b31d-e0e9f8011ed0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "08451f8e-b555-445f-b31d-e0e9f8011ed0" (UID: "08451f8e-b555-445f-b31d-e0e9f8011ed0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329510 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-combined-ca-bundle\") pod \"08451f8e-b555-445f-b31d-e0e9f8011ed0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329528 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data-custom\") pod \"08451f8e-b555-445f-b31d-e0e9f8011ed0\" (UID: \"08451f8e-b555-445f-b31d-e0e9f8011ed0\") " Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.329900 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08451f8e-b555-445f-b31d-e0e9f8011ed0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.339533 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08451f8e-b555-445f-b31d-e0e9f8011ed0" (UID: "08451f8e-b555-445f-b31d-e0e9f8011ed0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.344222 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-scripts" (OuterVolumeSpecName: "scripts") pod "08451f8e-b555-445f-b31d-e0e9f8011ed0" (UID: "08451f8e-b555-445f-b31d-e0e9f8011ed0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.355777 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08451f8e-b555-445f-b31d-e0e9f8011ed0-kube-api-access-bb92r" (OuterVolumeSpecName: "kube-api-access-bb92r") pod "08451f8e-b555-445f-b31d-e0e9f8011ed0" (UID: "08451f8e-b555-445f-b31d-e0e9f8011ed0"). InnerVolumeSpecName "kube-api-access-bb92r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.433681 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.433713 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.433723 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb92r\" (UniqueName: \"kubernetes.io/projected/08451f8e-b555-445f-b31d-e0e9f8011ed0-kube-api-access-bb92r\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.451998 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08451f8e-b555-445f-b31d-e0e9f8011ed0" (UID: "08451f8e-b555-445f-b31d-e0e9f8011ed0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.481488 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data" (OuterVolumeSpecName: "config-data") pod "08451f8e-b555-445f-b31d-e0e9f8011ed0" (UID: "08451f8e-b555-445f-b31d-e0e9f8011ed0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.535168 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:00 crc kubenswrapper[4672]: I1206 10:02:00.535204 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08451f8e-b555-445f-b31d-e0e9f8011ed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.061915 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"08451f8e-b555-445f-b31d-e0e9f8011ed0","Type":"ContainerDied","Data":"816a4578a08e84763807de9932a88cdbabe6d5bd69485183e00a84f783eeb5eb"} Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.062272 4672 scope.go:117] "RemoveContainer" containerID="828d9b901a2aa890f76a1742d2eb6df403db387fe3aa8e59c4d7a4b4c22189ac" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.062442 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.136372 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.157455 4672 scope.go:117] "RemoveContainer" containerID="6a96ece6ecfa8e3c40d934c26b0b186331960aae43ac5c94667325666c5c7e8d" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.157681 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.174257 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:02:01 crc kubenswrapper[4672]: E1206 10:02:01.174632 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="probe" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.174649 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="probe" Dec 06 10:02:01 crc kubenswrapper[4672]: E1206 10:02:01.200943 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="manila-scheduler" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.200985 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="manila-scheduler" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.201348 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="manila-scheduler" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.201394 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" containerName="probe" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.202711 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.202811 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.207803 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.257781 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-config-data\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.257845 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-scripts\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.257915 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.258028 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9t9\" (UniqueName: \"kubernetes.io/projected/ab157348-d161-4de2-bf4c-084cb71b0982-kube-api-access-9b9t9\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.258125 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.258161 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab157348-d161-4de2-bf4c-084cb71b0982-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.367635 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-scripts\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.367703 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.367741 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9t9\" (UniqueName: 
\"kubernetes.io/projected/ab157348-d161-4de2-bf4c-084cb71b0982-kube-api-access-9b9t9\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.367801 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.367822 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab157348-d161-4de2-bf4c-084cb71b0982-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.367899 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-config-data\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.375991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab157348-d161-4de2-bf4c-084cb71b0982-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.376826 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.378180 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.387784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-config-data\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.426495 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab157348-d161-4de2-bf4c-084cb71b0982-scripts\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.438467 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9t9\" (UniqueName: \"kubernetes.io/projected/ab157348-d161-4de2-bf4c-084cb71b0982-kube-api-access-9b9t9\") pod \"manila-scheduler-0\" (UID: \"ab157348-d161-4de2-bf4c-084cb71b0982\") " pod="openstack/manila-scheduler-0" Dec 06 10:02:01 crc kubenswrapper[4672]: I1206 10:02:01.530315 4672 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 06 10:02:02 crc kubenswrapper[4672]: I1206 10:02:02.044362 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 06 10:02:02 crc kubenswrapper[4672]: I1206 10:02:02.071421 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"ab157348-d161-4de2-bf4c-084cb71b0982","Type":"ContainerStarted","Data":"2c6d29e94fb6faebc74730e33105aa708ad56684500936873e6f65935a073c6c"} Dec 06 10:02:02 crc kubenswrapper[4672]: I1206 10:02:02.579551 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08451f8e-b555-445f-b31d-e0e9f8011ed0" path="/var/lib/kubelet/pods/08451f8e-b555-445f-b31d-e0e9f8011ed0/volumes" Dec 06 10:02:03 crc kubenswrapper[4672]: I1206 10:02:03.083881 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"ab157348-d161-4de2-bf4c-084cb71b0982","Type":"ContainerStarted","Data":"cc38fab6d17ba29525eacf997ca49bc7e1d7ccd2bc787276dc6de9988c2a95e1"} Dec 06 10:02:03 crc kubenswrapper[4672]: I1206 10:02:03.083932 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"ab157348-d161-4de2-bf4c-084cb71b0982","Type":"ContainerStarted","Data":"553eb5dcb8eecb537920464791285e4c9426918638cc49dc8203bb1c9e7d00bf"} Dec 06 10:02:03 crc kubenswrapper[4672]: I1206 10:02:03.112043 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.112020387 podStartE2EDuration="2.112020387s" podCreationTimestamp="2025-12-06 10:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:02:03.100366322 +0000 UTC m=+3340.844626619" watchObservedRunningTime="2025-12-06 10:02:03.112020387 +0000 UTC m=+3340.856280674" Dec 06 10:02:05 crc kubenswrapper[4672]: I1206 10:02:05.490495 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 06 10:02:07 crc kubenswrapper[4672]: I1206 10:02:07.121584 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42976197-15a4-4ceb-baf3-fa56682d89a6","Type":"ContainerStarted","Data":"ba795818d8ac6b50da6e7e676ca234c7f1f52b493b068f093ace86e18d18fdb7"} Dec 06 10:02:07 crc kubenswrapper[4672]: I1206 10:02:07.122440 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 10:02:08 crc kubenswrapper[4672]: I1206 10:02:08.013193 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 06 10:02:08 crc kubenswrapper[4672]: I1206 10:02:08.040093 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.80341298 podStartE2EDuration="14.040071329s" podCreationTimestamp="2025-12-06 10:01:54 +0000 UTC" firstStartedPulling="2025-12-06 10:01:55.980757444 +0000 UTC m=+3333.725017731" lastFinishedPulling="2025-12-06 10:02:06.217415793 +0000 UTC m=+3343.961676080" observedRunningTime="2025-12-06 10:02:07.166283914 +0000 UTC m=+3344.910544201" watchObservedRunningTime="2025-12-06 10:02:08.040071329 +0000 UTC m=+3345.784331616" Dec 06 10:02:08 crc kubenswrapper[4672]: I1206 10:02:08.075660 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:02:08 
crc kubenswrapper[4672]: I1206 10:02:08.160932 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="manila-share" containerID="cri-o://8b2d0be221687a567f51ed7fb58337300f6ca2d25de24d0c8dea815b511d0025" gracePeriod=30 Dec 06 10:02:08 crc kubenswrapper[4672]: I1206 10:02:08.161832 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="probe" containerID="cri-o://ed3a4957a69cdc5f5bfb86d0339c6fe0333d784786dc059c85b7f8c3c13e1209" gracePeriod=30 Dec 06 10:02:08 crc kubenswrapper[4672]: I1206 10:02:08.900612 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.155067 4672 generic.go:334] "Generic (PLEG): container finished" podID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerID="ed3a4957a69cdc5f5bfb86d0339c6fe0333d784786dc059c85b7f8c3c13e1209" exitCode=0 Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.155104 4672 generic.go:334] "Generic (PLEG): container finished" podID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerID="8b2d0be221687a567f51ed7fb58337300f6ca2d25de24d0c8dea815b511d0025" exitCode=1 Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.155129 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c109eeef-0858-4722-a7d6-cefd2b8979b4","Type":"ContainerDied","Data":"ed3a4957a69cdc5f5bfb86d0339c6fe0333d784786dc059c85b7f8c3c13e1209"} Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.155157 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c109eeef-0858-4722-a7d6-cefd2b8979b4","Type":"ContainerDied","Data":"8b2d0be221687a567f51ed7fb58337300f6ca2d25de24d0c8dea815b511d0025"} Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.429873 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.547940 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-combined-ca-bundle\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548279 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-var-lib-manila\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548312 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-etc-machine-id\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548359 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548387 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data-custom\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548426 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-ceph\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548421 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548486 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxd7q\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-kube-api-access-xxd7q\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548518 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-scripts\") pod \"c109eeef-0858-4722-a7d6-cefd2b8979b4\" (UID: \"c109eeef-0858-4722-a7d6-cefd2b8979b4\") " Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.548978 4672 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.549548 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.557916 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-kube-api-access-xxd7q" (OuterVolumeSpecName: "kube-api-access-xxd7q") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "kube-api-access-xxd7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.564522 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-ceph" (OuterVolumeSpecName: "ceph") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.565388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.565755 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-scripts" (OuterVolumeSpecName: "scripts") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.622769 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.650810 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.651191 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c109eeef-0858-4722-a7d6-cefd2b8979b4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.651267 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.651505 4672 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-ceph\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.651625 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxd7q\" (UniqueName: \"kubernetes.io/projected/c109eeef-0858-4722-a7d6-cefd2b8979b4-kube-api-access-xxd7q\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.651684 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.710788 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data" (OuterVolumeSpecName: "config-data") pod "c109eeef-0858-4722-a7d6-cefd2b8979b4" (UID: "c109eeef-0858-4722-a7d6-cefd2b8979b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:09 crc kubenswrapper[4672]: I1206 10:02:09.753563 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c109eeef-0858-4722-a7d6-cefd2b8979b4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.164925 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c109eeef-0858-4722-a7d6-cefd2b8979b4","Type":"ContainerDied","Data":"bf2b03df96b3e6d5a9fe1850d264e32ce28ddce1fdd8f2356e4a1531268b03bb"} Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.164982 4672 scope.go:117] "RemoveContainer" containerID="ed3a4957a69cdc5f5bfb86d0339c6fe0333d784786dc059c85b7f8c3c13e1209" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.165111 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.207019 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.212228 4672 scope.go:117] "RemoveContainer" containerID="8b2d0be221687a567f51ed7fb58337300f6ca2d25de24d0c8dea815b511d0025" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.217156 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.246470 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:02:10 crc kubenswrapper[4672]: E1206 10:02:10.246858 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="probe" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.246874 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="probe" Dec 06 10:02:10 crc kubenswrapper[4672]: E1206 10:02:10.246904 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="manila-share" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.246912 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="manila-share" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.247074 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="manila-share" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.247107 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" containerName="probe" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.248253 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.250783 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.268226 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368686 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368786 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-scripts\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368810 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnsss\" (UniqueName: \"kubernetes.io/projected/d7cfcb36-13c1-4215-b316-b2082d41bcae-kube-api-access-qnsss\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368844 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d7cfcb36-13c1-4215-b316-b2082d41bcae-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368898 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-config-data\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368922 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7cfcb36-13c1-4215-b316-b2082d41bcae-ceph\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368954 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7cfcb36-13c1-4215-b316-b2082d41bcae-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.368979 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc 
kubenswrapper[4672]: I1206 10:02:10.470873 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.470998 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-scripts\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471036 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnsss\" (UniqueName: \"kubernetes.io/projected/d7cfcb36-13c1-4215-b316-b2082d41bcae-kube-api-access-qnsss\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471067 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d7cfcb36-13c1-4215-b316-b2082d41bcae-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471107 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-config-data\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471142 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7cfcb36-13c1-4215-b316-b2082d41bcae-ceph\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471184 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7cfcb36-13c1-4215-b316-b2082d41bcae-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471223 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471444 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d7cfcb36-13c1-4215-b316-b2082d41bcae-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.471444 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d7cfcb36-13c1-4215-b316-b2082d41bcae-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.481634 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.482067 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.482628 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-config-data\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.499235 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cfcb36-13c1-4215-b316-b2082d41bcae-scripts\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.506896 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7cfcb36-13c1-4215-b316-b2082d41bcae-ceph\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.531054 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnsss\" (UniqueName: \"kubernetes.io/projected/d7cfcb36-13c1-4215-b316-b2082d41bcae-kube-api-access-qnsss\") pod \"manila-share-share1-0\" (UID: \"d7cfcb36-13c1-4215-b316-b2082d41bcae\") " pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.580102 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 06 10:02:10 crc kubenswrapper[4672]: I1206 10:02:10.603850 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c109eeef-0858-4722-a7d6-cefd2b8979b4" path="/var/lib/kubelet/pods/c109eeef-0858-4722-a7d6-cefd2b8979b4/volumes" Dec 06 10:02:11 crc kubenswrapper[4672]: I1206 10:02:11.249233 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 06 10:02:11 crc kubenswrapper[4672]: I1206 10:02:11.532966 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 06 10:02:12 crc kubenswrapper[4672]: I1206 10:02:12.189679 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d7cfcb36-13c1-4215-b316-b2082d41bcae","Type":"ContainerStarted","Data":"9a6306dfb17729f1170cabae62b767defcd92e926b54920e3d136cf9cc04f295"} Dec 06 10:02:12 crc kubenswrapper[4672]: I1206 10:02:12.190008 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d7cfcb36-13c1-4215-b316-b2082d41bcae","Type":"ContainerStarted","Data":"217bbf84c65aacbb5ee9403ccb1d840876f614521a880927d648082fa098eeb1"} Dec 06 10:02:13 crc kubenswrapper[4672]: I1206 10:02:13.202819 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d7cfcb36-13c1-4215-b316-b2082d41bcae","Type":"ContainerStarted","Data":"bfcfec15bb0aa648d32167240fcc558e7d324debdffff8211988dea5c3643bcb"} Dec 06 10:02:13 crc kubenswrapper[4672]: I1206 10:02:13.234896 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.234877633 podStartE2EDuration="3.234877633s" podCreationTimestamp="2025-12-06 10:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:02:13.233318151 +0000 UTC m=+3350.977578448" watchObservedRunningTime="2025-12-06 10:02:13.234877633 +0000 UTC m=+3350.979137930" Dec 06 10:02:18 crc kubenswrapper[4672]: I1206 10:02:18.900236 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b745d8b98-pzrsc" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Dec 06 10:02:20 crc kubenswrapper[4672]: I1206 10:02:20.580905 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 06 10:02:23 crc kubenswrapper[4672]: I1206 10:02:23.122972 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 06 10:02:25 crc kubenswrapper[4672]: I1206 10:02:25.330786 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.116391 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.225832 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-combined-ca-bundle\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.225866 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-tls-certs\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.225926 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clsmh\" (UniqueName: \"kubernetes.io/projected/afd13bd1-0e47-4739-9f82-e673232e3c61-kube-api-access-clsmh\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.225972 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd13bd1-0e47-4739-9f82-e673232e3c61-logs\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.226002 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-scripts\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.226035 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-config-data\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.226100 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-secret-key\") pod \"afd13bd1-0e47-4739-9f82-e673232e3c61\" (UID: \"afd13bd1-0e47-4739-9f82-e673232e3c61\") " Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.229947 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afd13bd1-0e47-4739-9f82-e673232e3c61-logs" (OuterVolumeSpecName: "logs") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.234082 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd13bd1-0e47-4739-9f82-e673232e3c61-kube-api-access-clsmh" (OuterVolumeSpecName: "kube-api-access-clsmh") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "kube-api-access-clsmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.234577 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.259452 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-config-data" (OuterVolumeSpecName: "config-data") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.270207 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.278765 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-scripts" (OuterVolumeSpecName: "scripts") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.292914 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "afd13bd1-0e47-4739-9f82-e673232e3c61" (UID: "afd13bd1-0e47-4739-9f82-e673232e3c61"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332776 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332811 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332823 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332832 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd13bd1-0e47-4739-9f82-e673232e3c61-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332843 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clsmh\" (UniqueName: \"kubernetes.io/projected/afd13bd1-0e47-4739-9f82-e673232e3c61-kube-api-access-clsmh\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332851 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd13bd1-0e47-4739-9f82-e673232e3c61-logs\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.332860 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd13bd1-0e47-4739-9f82-e673232e3c61-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.340863 4672 generic.go:334] "Generic (PLEG): container finished" podID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerID="5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641" exitCode=137 Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.340979 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b745d8b98-pzrsc" event={"ID":"afd13bd1-0e47-4739-9f82-e673232e3c61","Type":"ContainerDied","Data":"5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641"} Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.341050 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b745d8b98-pzrsc" event={"ID":"afd13bd1-0e47-4739-9f82-e673232e3c61","Type":"ContainerDied","Data":"63be707c1f4a160a8d5ec72b4a3245a9569529caf6eaf1780ad01344c825e31a"} Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.341108 4672 scope.go:117] "RemoveContainer" containerID="904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.341266 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b745d8b98-pzrsc" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.387401 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b745d8b98-pzrsc"] Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.399961 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b745d8b98-pzrsc"] Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.534221 4672 scope.go:117] "RemoveContainer" containerID="5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.554553 4672 scope.go:117] "RemoveContainer" containerID="904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d" Dec 06 10:02:26 crc kubenswrapper[4672]: E1206 10:02:26.555118 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d\": container with ID starting with 904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d not found: ID does not exist" containerID="904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.555156 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d"} err="failed to get container status \"904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d\": rpc error: code = NotFound desc = could not find container \"904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d\": container with ID starting with 904ae83ce27cc40a81632b38ca1eeb1805e0dc909323a785271dde76df7f210d not found: ID does not exist" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.555180 4672 scope.go:117] "RemoveContainer" containerID="5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641" Dec 06 10:02:26 crc kubenswrapper[4672]: E1206 10:02:26.555703 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641\": container with ID starting with 5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641 not found: ID does not exist" containerID="5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.555737 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641"} err="failed to get container status \"5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641\": rpc error: code = NotFound desc = could not find container \"5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641\": container with ID starting with 5a27632bac37b4f4e263c64bd1c2b9c86c1c68c6b8f656990d806f14600af641 not found: ID does not exist" Dec 06 10:02:26 crc kubenswrapper[4672]: I1206 10:02:26.574662 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" path="/var/lib/kubelet/pods/afd13bd1-0e47-4739-9f82-e673232e3c61/volumes" Dec 06 10:02:32 crc kubenswrapper[4672]: I1206 10:02:32.151997 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.838840 4672 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 10:03:38 crc kubenswrapper[4672]: E1206 10:03:38.839770 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon-log" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.839787 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon-log" Dec 06 10:03:38 crc kubenswrapper[4672]: E1206 10:03:38.839815 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.839821 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.840021 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon-log" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.840043 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd13bd1-0e47-4739-9f82-e673232e3c61" containerName="horizon" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.840767 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.843146 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7gqdm" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.843414 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.843681 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.843692 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.859966 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928240 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928302 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928328 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928355 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928505 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57mp\" (UniqueName: \"kubernetes.io/projected/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-kube-api-access-s57mp\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928637 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-config-data\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928739 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928857 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:38 crc kubenswrapper[4672]: I1206 10:03:38.928959 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.030974 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031040 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031088 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031122 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57mp\" (UniqueName: \"kubernetes.io/projected/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-kube-api-access-s57mp\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031152 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-config-data\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031186 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031270 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.031704 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.032477 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.032844 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.032863 4672 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-config-data\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.033219 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.046858 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.047427 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.048177 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.062135 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57mp\" (UniqueName: \"kubernetes.io/projected/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-kube-api-access-s57mp\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.064720 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.161096 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.626749 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 10:03:39 crc kubenswrapper[4672]: I1206 10:03:39.627410 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:03:40 crc kubenswrapper[4672]: I1206 10:03:40.067770 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d","Type":"ContainerStarted","Data":"7ecacedd532b69bacd6aca045a8634d2bcbfc14689d5df865d085e5cfd5f46b9"} Dec 06 10:03:42 crc kubenswrapper[4672]: I1206 10:03:42.319996 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:03:42 crc kubenswrapper[4672]: I1206 10:03:42.320590 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:04:12 crc kubenswrapper[4672]: I1206 10:04:12.319225 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:04:12 crc kubenswrapper[4672]: I1206 10:04:12.319859 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:04:19 crc kubenswrapper[4672]: E1206 10:04:19.105399 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 06 10:04:19 crc kubenswrapper[4672]: E1206 10:04:19.108431 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s57mp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 10:04:19 crc kubenswrapper[4672]: E1206 10:04:19.109694 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" Dec 06 10:04:19 crc kubenswrapper[4672]: E1206 10:04:19.506846 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" Dec 06 10:04:31 crc kubenswrapper[4672]: I1206 10:04:31.210958 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 10:04:32 crc kubenswrapper[4672]: I1206 10:04:32.628540 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d","Type":"ContainerStarted","Data":"f5b3d8749f523904d43ba52bdbfcbc08406e845e238f2f9d8ea548d9e56c9f73"} Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.320143 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.320802 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.320850 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.321581 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b52c4a53b3f1a8646d79189c3bda17a1d38aee3f1effad6325de3a3cc88f79b3"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.321650 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://b52c4a53b3f1a8646d79189c3bda17a1d38aee3f1effad6325de3a3cc88f79b3" gracePeriod=600 Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.753135 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="b52c4a53b3f1a8646d79189c3bda17a1d38aee3f1effad6325de3a3cc88f79b3" exitCode=0 Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.753370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"b52c4a53b3f1a8646d79189c3bda17a1d38aee3f1effad6325de3a3cc88f79b3"} Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.753523 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"} Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.753551 4672 scope.go:117] "RemoveContainer" containerID="7bbf2781550e7a61427a2b236cb3b966725940a4a76024981629c0dc4fd1af55" Dec 06 10:04:42 crc kubenswrapper[4672]: I1206 10:04:42.776966 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=14.196826984 podStartE2EDuration="1m5.776949192s" podCreationTimestamp="2025-12-06 10:03:37 +0000 UTC" firstStartedPulling="2025-12-06 10:03:39.62716083 +0000 UTC m=+3437.371421107" lastFinishedPulling="2025-12-06 10:04:31.207283028 +0000 UTC m=+3488.951543315" observedRunningTime="2025-12-06 10:04:32.655204417 +0000 UTC m=+3490.399464724" watchObservedRunningTime="2025-12-06 10:04:42.776949192 +0000 UTC m=+3500.521209479" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.065844 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nsljk"] Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.079613 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsljk"] Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.079789 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.127800 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-catalog-content\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.127906 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-utilities\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.127981 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmvld\" (UniqueName: \"kubernetes.io/projected/d898c68a-5600-4634-b005-69e25e854b40-kube-api-access-qmvld\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.229891 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-catalog-content\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.230072 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-utilities\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.230168 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmvld\" (UniqueName: \"kubernetes.io/projected/d898c68a-5600-4634-b005-69e25e854b40-kube-api-access-qmvld\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.230420 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-catalog-content\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.231132 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-utilities\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.253423 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmvld\" (UniqueName: \"kubernetes.io/projected/d898c68a-5600-4634-b005-69e25e854b40-kube-api-access-qmvld\") pod \"redhat-operators-nsljk\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.408280 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:48 crc kubenswrapper[4672]: I1206 10:04:48.953854 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsljk"] Dec 06 10:04:49 crc kubenswrapper[4672]: I1206 10:04:49.827302 4672 generic.go:334] "Generic (PLEG): container finished" podID="d898c68a-5600-4634-b005-69e25e854b40" containerID="60025c3a61ccab73864fb4f179e814e5ea694a5975305e140f95ec991ce8b1a0" exitCode=0 Dec 06 10:04:49 crc kubenswrapper[4672]: I1206 10:04:49.827398 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerDied","Data":"60025c3a61ccab73864fb4f179e814e5ea694a5975305e140f95ec991ce8b1a0"} Dec 06 10:04:49 crc kubenswrapper[4672]: I1206 10:04:49.828282 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerStarted","Data":"d03f60ee2cec747afcd49ee29be413d56fdfc48c4241d33a1a65d2beb909d130"} Dec 06 10:04:50 crc kubenswrapper[4672]: I1206 10:04:50.841577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerStarted","Data":"f22ff2e6db5022a5957e6339b22b4e05cb7c71e150389319c5523347cc90fd9d"} Dec 06 10:04:54 crc kubenswrapper[4672]: I1206 10:04:54.909047 4672 generic.go:334] "Generic (PLEG): container finished" podID="d898c68a-5600-4634-b005-69e25e854b40" containerID="f22ff2e6db5022a5957e6339b22b4e05cb7c71e150389319c5523347cc90fd9d" exitCode=0 Dec 06 10:04:54 crc kubenswrapper[4672]: I1206 10:04:54.909149 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" 
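
Each marketplace catalog pod above mounts two emptyDir volumes ("utilities", "catalog-content") plus a projected service-account token; the VerifyControllerAttachedVolume -> MountVolume.SetUp progression in the log is the kubelet volume manager reconciling exactly that spec. A sketch of the two emptyDir declarations, with the names taken from the log and everything else boilerplate:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // Volume names from the reconciler entries above.
        vols := []corev1.Volume{
            {Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
        }
        for _, v := range vols {
            fmt.Println(v.Name)
        }
    }
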
event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerDied","Data":"f22ff2e6db5022a5957e6339b22b4e05cb7c71e150389319c5523347cc90fd9d"} Dec 06 10:04:55 crc kubenswrapper[4672]: I1206 10:04:55.920822 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerStarted","Data":"f417bdf3f829bceb2ea1d43dc778b1db7aa697fc79145159cd87237e7e2f1ddd"} Dec 06 10:04:55 crc kubenswrapper[4672]: I1206 10:04:55.944239 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nsljk" podStartSLOduration=2.462445342 podStartE2EDuration="7.944216227s" podCreationTimestamp="2025-12-06 10:04:48 +0000 UTC" firstStartedPulling="2025-12-06 10:04:49.831354778 +0000 UTC m=+3507.575615075" lastFinishedPulling="2025-12-06 10:04:55.313125673 +0000 UTC m=+3513.057385960" observedRunningTime="2025-12-06 10:04:55.941145204 +0000 UTC m=+3513.685405541" watchObservedRunningTime="2025-12-06 10:04:55.944216227 +0000 UTC m=+3513.688476514" Dec 06 10:04:58 crc kubenswrapper[4672]: I1206 10:04:58.412147 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:58 crc kubenswrapper[4672]: I1206 10:04:58.412659 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:04:59 crc kubenswrapper[4672]: I1206 10:04:59.466556 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nsljk" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="registry-server" probeResult="failure" output=< Dec 06 10:04:59 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 10:04:59 crc kubenswrapper[4672]: > Dec 06 10:05:09 crc kubenswrapper[4672]: I1206 10:05:09.472199 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nsljk" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="registry-server" probeResult="failure" output=< Dec 06 10:05:09 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 10:05:09 crc kubenswrapper[4672]: > Dec 06 10:05:15 crc kubenswrapper[4672]: I1206 10:05:15.847790 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rnmb"] Dec 06 10:05:15 crc kubenswrapper[4672]: I1206 10:05:15.850041 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:15 crc kubenswrapper[4672]: I1206 10:05:15.916459 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rnmb"] Dec 06 10:05:15 crc kubenswrapper[4672]: I1206 10:05:15.955672 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-utilities\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:15 crc kubenswrapper[4672]: I1206 10:05:15.955730 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-catalog-content\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:15 crc kubenswrapper[4672]: I1206 10:05:15.956260 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzl5q\" (UniqueName: \"kubernetes.io/projected/02616aec-9825-4c17-9969-eadd0405341c-kube-api-access-rzl5q\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.062042 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-utilities\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.062090 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-catalog-content\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.062419 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzl5q\" (UniqueName: \"kubernetes.io/projected/02616aec-9825-4c17-9969-eadd0405341c-kube-api-access-rzl5q\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.063287 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-utilities\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.063546 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-catalog-content\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.089705 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rzl5q\" (UniqueName: \"kubernetes.io/projected/02616aec-9825-4c17-9969-eadd0405341c-kube-api-access-rzl5q\") pod \"certified-operators-8rnmb\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.167538 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:16 crc kubenswrapper[4672]: W1206 10:05:16.656233 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02616aec_9825_4c17_9969_eadd0405341c.slice/crio-ca997d8ec01cbab54838da9907ed237e5a7dd755959332e0fee1c5be1fe44ff1 WatchSource:0}: Error finding container ca997d8ec01cbab54838da9907ed237e5a7dd755959332e0fee1c5be1fe44ff1: Status 404 returned error can't find the container with id ca997d8ec01cbab54838da9907ed237e5a7dd755959332e0fee1c5be1fe44ff1 Dec 06 10:05:16 crc kubenswrapper[4672]: I1206 10:05:16.657296 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rnmb"] Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.115579 4672 generic.go:334] "Generic (PLEG): container finished" podID="02616aec-9825-4c17-9969-eadd0405341c" containerID="0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e" exitCode=0 Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.115677 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerDied","Data":"0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e"} Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.115729 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerStarted","Data":"ca997d8ec01cbab54838da9907ed237e5a7dd755959332e0fee1c5be1fe44ff1"} Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.241459 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h7n65"] Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.243759 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.280751 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7n65"] Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.386892 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-utilities\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.387283 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s475g\" (UniqueName: \"kubernetes.io/projected/c35debca-4e88-40f8-b458-0800427ff967-kube-api-access-s475g\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.387319 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-catalog-content\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.489275 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-utilities\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.489372 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s475g\" (UniqueName: \"kubernetes.io/projected/c35debca-4e88-40f8-b458-0800427ff967-kube-api-access-s475g\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.489405 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-catalog-content\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.489980 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-utilities\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.490034 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-catalog-content\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.508028 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s475g\" (UniqueName: \"kubernetes.io/projected/c35debca-4e88-40f8-b458-0800427ff967-kube-api-access-s475g\") pod \"community-operators-h7n65\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:17 crc kubenswrapper[4672]: I1206 10:05:17.581457 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:18 crc kubenswrapper[4672]: I1206 10:05:18.159944 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7n65"] Dec 06 10:05:18 crc kubenswrapper[4672]: I1206 10:05:18.452137 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:05:18 crc kubenswrapper[4672]: I1206 10:05:18.518583 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:05:19 crc kubenswrapper[4672]: I1206 10:05:19.134805 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerStarted","Data":"fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd"} Dec 06 10:05:19 crc kubenswrapper[4672]: I1206 10:05:19.146351 4672 generic.go:334] "Generic (PLEG): container finished" podID="c35debca-4e88-40f8-b458-0800427ff967" containerID="ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999" exitCode=0 Dec 06 10:05:19 crc kubenswrapper[4672]: I1206 10:05:19.148273 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerDied","Data":"ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999"} Dec 06 10:05:19 crc kubenswrapper[4672]: I1206 10:05:19.148329 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerStarted","Data":"1e29d3b84b2cab8f66282dbfe77f78ec27e6871204f55e6834b37d89d36a9bbb"} Dec 06 10:05:20 crc kubenswrapper[4672]: I1206 10:05:20.156439 4672 generic.go:334] "Generic (PLEG): container finished" podID="02616aec-9825-4c17-9969-eadd0405341c" containerID="fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd" exitCode=0 Dec 06 10:05:20 crc kubenswrapper[4672]: I1206 10:05:20.156517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerDied","Data":"fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd"} Dec 06 10:05:21 crc kubenswrapper[4672]: I1206 10:05:21.166124 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerStarted","Data":"fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4"} Dec 06 10:05:21 crc kubenswrapper[4672]: I1206 10:05:21.169642 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerStarted","Data":"1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315"} Dec 06 10:05:21 crc kubenswrapper[4672]: I1206 
10:05:21.221837 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rnmb" podStartSLOduration=2.732109623 podStartE2EDuration="6.221820363s" podCreationTimestamp="2025-12-06 10:05:15 +0000 UTC" firstStartedPulling="2025-12-06 10:05:17.117566286 +0000 UTC m=+3534.861826573" lastFinishedPulling="2025-12-06 10:05:20.607277026 +0000 UTC m=+3538.351537313" observedRunningTime="2025-12-06 10:05:21.216738796 +0000 UTC m=+3538.960999083" watchObservedRunningTime="2025-12-06 10:05:21.221820363 +0000 UTC m=+3538.966080650" Dec 06 10:05:21 crc kubenswrapper[4672]: I1206 10:05:21.882519 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsljk"] Dec 06 10:05:21 crc kubenswrapper[4672]: I1206 10:05:21.883163 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nsljk" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="registry-server" containerID="cri-o://f417bdf3f829bceb2ea1d43dc778b1db7aa697fc79145159cd87237e7e2f1ddd" gracePeriod=2 Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.178434 4672 generic.go:334] "Generic (PLEG): container finished" podID="c35debca-4e88-40f8-b458-0800427ff967" containerID="fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4" exitCode=0 Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.178492 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerDied","Data":"fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4"} Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.185919 4672 generic.go:334] "Generic (PLEG): container finished" podID="d898c68a-5600-4634-b005-69e25e854b40" containerID="f417bdf3f829bceb2ea1d43dc778b1db7aa697fc79145159cd87237e7e2f1ddd" exitCode=0 Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.185966 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerDied","Data":"f417bdf3f829bceb2ea1d43dc778b1db7aa697fc79145159cd87237e7e2f1ddd"} Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.650317 4672 util.go:48] "No ready sandbox for pod can be found. 
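
The SyncLoop DELETE above immediately becomes "Killing container with a grace period" with gracePeriod=2, i.e. the API-side delete carried a 2-second grace period that the kubelet honors when stopping registry-server. A client-go sketch of issuing the same delete (clientset construction as in the earlier sketch; pod and namespace names from the log):

    package main

    import (
        "context"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        grace := int64(2) // gracePeriod=2, as in the kubelet entry above
        if err := cs.CoreV1().Pods("openshift-marketplace").Delete(
            context.TODO(), "redhat-operators-nsljk",
            metav1.DeleteOptions{GracePeriodSeconds: &grace},
        ); err != nil {
            log.Fatal(err)
        }
    }
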
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.726435 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmvld\" (UniqueName: \"kubernetes.io/projected/d898c68a-5600-4634-b005-69e25e854b40-kube-api-access-qmvld\") pod \"d898c68a-5600-4634-b005-69e25e854b40\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.726539 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-utilities\") pod \"d898c68a-5600-4634-b005-69e25e854b40\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.726584 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-catalog-content\") pod \"d898c68a-5600-4634-b005-69e25e854b40\" (UID: \"d898c68a-5600-4634-b005-69e25e854b40\") " Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.730551 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-utilities" (OuterVolumeSpecName: "utilities") pod "d898c68a-5600-4634-b005-69e25e854b40" (UID: "d898c68a-5600-4634-b005-69e25e854b40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.757114 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d898c68a-5600-4634-b005-69e25e854b40-kube-api-access-qmvld" (OuterVolumeSpecName: "kube-api-access-qmvld") pod "d898c68a-5600-4634-b005-69e25e854b40" (UID: "d898c68a-5600-4634-b005-69e25e854b40"). InnerVolumeSpecName "kube-api-access-qmvld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.830496 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmvld\" (UniqueName: \"kubernetes.io/projected/d898c68a-5600-4634-b005-69e25e854b40-kube-api-access-qmvld\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.830527 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.918322 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d898c68a-5600-4634-b005-69e25e854b40" (UID: "d898c68a-5600-4634-b005-69e25e854b40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4672]: I1206 10:05:22.932012 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d898c68a-5600-4634-b005-69e25e854b40-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.194998 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerStarted","Data":"38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37"} Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.197506 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsljk" event={"ID":"d898c68a-5600-4634-b005-69e25e854b40","Type":"ContainerDied","Data":"d03f60ee2cec747afcd49ee29be413d56fdfc48c4241d33a1a65d2beb909d130"} Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.197530 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsljk" Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.197555 4672 scope.go:117] "RemoveContainer" containerID="f417bdf3f829bceb2ea1d43dc778b1db7aa697fc79145159cd87237e7e2f1ddd" Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.219948 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h7n65" podStartSLOduration=2.730953791 podStartE2EDuration="6.219931772s" podCreationTimestamp="2025-12-06 10:05:17 +0000 UTC" firstStartedPulling="2025-12-06 10:05:19.150612209 +0000 UTC m=+3536.894872496" lastFinishedPulling="2025-12-06 10:05:22.63959019 +0000 UTC m=+3540.383850477" observedRunningTime="2025-12-06 10:05:23.216138739 +0000 UTC m=+3540.960399036" watchObservedRunningTime="2025-12-06 10:05:23.219931772 +0000 UTC m=+3540.964192059" Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.220453 4672 scope.go:117] "RemoveContainer" containerID="f22ff2e6db5022a5957e6339b22b4e05cb7c71e150389319c5523347cc90fd9d" Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.248150 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsljk"] Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.250012 4672 scope.go:117] "RemoveContainer" containerID="60025c3a61ccab73864fb4f179e814e5ea694a5975305e140f95ec991ce8b1a0" Dec 06 10:05:23 crc kubenswrapper[4672]: I1206 10:05:23.258640 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nsljk"] Dec 06 10:05:24 crc kubenswrapper[4672]: I1206 10:05:24.571692 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d898c68a-5600-4634-b005-69e25e854b40" path="/var/lib/kubelet/pods/d898c68a-5600-4634-b005-69e25e854b40/volumes" Dec 06 10:05:26 crc kubenswrapper[4672]: I1206 10:05:26.168737 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:26 crc kubenswrapper[4672]: I1206 10:05:26.169811 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:27 crc kubenswrapper[4672]: I1206 10:05:27.232371 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8rnmb" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="registry-server" 
probeResult="failure" output=< Dec 06 10:05:27 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 10:05:27 crc kubenswrapper[4672]: > Dec 06 10:05:27 crc kubenswrapper[4672]: I1206 10:05:27.582064 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:27 crc kubenswrapper[4672]: I1206 10:05:27.582288 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:27 crc kubenswrapper[4672]: I1206 10:05:27.623440 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:28 crc kubenswrapper[4672]: I1206 10:05:28.304394 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:28 crc kubenswrapper[4672]: I1206 10:05:28.377789 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7n65"] Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.267654 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h7n65" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="registry-server" containerID="cri-o://38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37" gracePeriod=2 Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.745793 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.824621 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s475g\" (UniqueName: \"kubernetes.io/projected/c35debca-4e88-40f8-b458-0800427ff967-kube-api-access-s475g\") pod \"c35debca-4e88-40f8-b458-0800427ff967\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.824725 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-utilities\") pod \"c35debca-4e88-40f8-b458-0800427ff967\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.824833 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-catalog-content\") pod \"c35debca-4e88-40f8-b458-0800427ff967\" (UID: \"c35debca-4e88-40f8-b458-0800427ff967\") " Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.826070 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-utilities" (OuterVolumeSpecName: "utilities") pod "c35debca-4e88-40f8-b458-0800427ff967" (UID: "c35debca-4e88-40f8-b458-0800427ff967"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.848611 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35debca-4e88-40f8-b458-0800427ff967-kube-api-access-s475g" (OuterVolumeSpecName: "kube-api-access-s475g") pod "c35debca-4e88-40f8-b458-0800427ff967" (UID: "c35debca-4e88-40f8-b458-0800427ff967"). 
InnerVolumeSpecName "kube-api-access-s475g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.875014 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c35debca-4e88-40f8-b458-0800427ff967" (UID: "c35debca-4e88-40f8-b458-0800427ff967"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.927135 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.927172 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s475g\" (UniqueName: \"kubernetes.io/projected/c35debca-4e88-40f8-b458-0800427ff967-kube-api-access-s475g\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:30 crc kubenswrapper[4672]: I1206 10:05:30.927184 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35debca-4e88-40f8-b458-0800427ff967-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.279566 4672 generic.go:334] "Generic (PLEG): container finished" podID="c35debca-4e88-40f8-b458-0800427ff967" containerID="38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37" exitCode=0 Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.279643 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerDied","Data":"38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37"} Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.279662 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7n65" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.279682 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7n65" event={"ID":"c35debca-4e88-40f8-b458-0800427ff967","Type":"ContainerDied","Data":"1e29d3b84b2cab8f66282dbfe77f78ec27e6871204f55e6834b37d89d36a9bbb"} Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.279711 4672 scope.go:117] "RemoveContainer" containerID="38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.307394 4672 scope.go:117] "RemoveContainer" containerID="fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.338985 4672 scope.go:117] "RemoveContainer" containerID="ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.347163 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7n65"] Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.356632 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h7n65"] Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.385044 4672 scope.go:117] "RemoveContainer" containerID="38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37" Dec 06 10:05:31 crc kubenswrapper[4672]: E1206 10:05:31.387494 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37\": container with ID starting with 38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37 not found: ID does not exist" containerID="38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.387530 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37"} err="failed to get container status \"38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37\": rpc error: code = NotFound desc = could not find container \"38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37\": container with ID starting with 38138c0d6c360e0849fe00f9e9b5f64b0d46657441f6d34d9adc11435a9d7c37 not found: ID does not exist" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.387557 4672 scope.go:117] "RemoveContainer" containerID="fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4" Dec 06 10:05:31 crc kubenswrapper[4672]: E1206 10:05:31.387912 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4\": container with ID starting with fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4 not found: ID does not exist" containerID="fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.387954 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4"} err="failed to get container status \"fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4\": rpc error: code = NotFound desc = could not find 
container \"fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4\": container with ID starting with fe19c3bd41ea778b21bb5d7fdd81b8fc8dd18f3b5304f0452a3e5a90dbea91e4 not found: ID does not exist" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.387981 4672 scope.go:117] "RemoveContainer" containerID="ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999" Dec 06 10:05:31 crc kubenswrapper[4672]: E1206 10:05:31.388761 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999\": container with ID starting with ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999 not found: ID does not exist" containerID="ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999" Dec 06 10:05:31 crc kubenswrapper[4672]: I1206 10:05:31.388793 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999"} err="failed to get container status \"ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999\": rpc error: code = NotFound desc = could not find container \"ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999\": container with ID starting with ff7b8f7d0d2a0d6bb1a0afb669bb765a8efca7b77b10688c3e12217414959999 not found: ID does not exist" Dec 06 10:05:32 crc kubenswrapper[4672]: I1206 10:05:32.567077 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35debca-4e88-40f8-b458-0800427ff967" path="/var/lib/kubelet/pods/c35debca-4e88-40f8-b458-0800427ff967/volumes" Dec 06 10:05:36 crc kubenswrapper[4672]: I1206 10:05:36.212872 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:36 crc kubenswrapper[4672]: I1206 10:05:36.265634 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:36 crc kubenswrapper[4672]: I1206 10:05:36.762119 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rnmb"] Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.345565 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rnmb" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="registry-server" containerID="cri-o://1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315" gracePeriod=2 Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.846137 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.876392 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-catalog-content\") pod \"02616aec-9825-4c17-9969-eadd0405341c\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.876520 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-utilities\") pod \"02616aec-9825-4c17-9969-eadd0405341c\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.876634 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzl5q\" (UniqueName: \"kubernetes.io/projected/02616aec-9825-4c17-9969-eadd0405341c-kube-api-access-rzl5q\") pod \"02616aec-9825-4c17-9969-eadd0405341c\" (UID: \"02616aec-9825-4c17-9969-eadd0405341c\") " Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.877345 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-utilities" (OuterVolumeSpecName: "utilities") pod "02616aec-9825-4c17-9969-eadd0405341c" (UID: "02616aec-9825-4c17-9969-eadd0405341c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.883705 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02616aec-9825-4c17-9969-eadd0405341c-kube-api-access-rzl5q" (OuterVolumeSpecName: "kube-api-access-rzl5q") pod "02616aec-9825-4c17-9969-eadd0405341c" (UID: "02616aec-9825-4c17-9969-eadd0405341c"). InnerVolumeSpecName "kube-api-access-rzl5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.959370 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02616aec-9825-4c17-9969-eadd0405341c" (UID: "02616aec-9825-4c17-9969-eadd0405341c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.978395 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.978590 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzl5q\" (UniqueName: \"kubernetes.io/projected/02616aec-9825-4c17-9969-eadd0405341c-kube-api-access-rzl5q\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:37 crc kubenswrapper[4672]: I1206 10:05:37.978679 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02616aec-9825-4c17-9969-eadd0405341c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.357660 4672 generic.go:334] "Generic (PLEG): container finished" podID="02616aec-9825-4c17-9969-eadd0405341c" containerID="1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315" exitCode=0 Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.357710 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerDied","Data":"1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315"} Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.357741 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rnmb" event={"ID":"02616aec-9825-4c17-9969-eadd0405341c","Type":"ContainerDied","Data":"ca997d8ec01cbab54838da9907ed237e5a7dd755959332e0fee1c5be1fe44ff1"} Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.357762 4672 scope.go:117] "RemoveContainer" containerID="1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.358804 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rnmb" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.394857 4672 scope.go:117] "RemoveContainer" containerID="fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.399867 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rnmb"] Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.408267 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rnmb"] Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.429862 4672 scope.go:117] "RemoveContainer" containerID="0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.472820 4672 scope.go:117] "RemoveContainer" containerID="1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315" Dec 06 10:05:38 crc kubenswrapper[4672]: E1206 10:05:38.473213 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315\": container with ID starting with 1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315 not found: ID does not exist" containerID="1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.473244 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315"} err="failed to get container status \"1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315\": rpc error: code = NotFound desc = could not find container \"1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315\": container with ID starting with 1071584c93496665f0cfd9ea45c15543cb232e1113ec2fc4bd273365605b0315 not found: ID does not exist" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.473265 4672 scope.go:117] "RemoveContainer" containerID="fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd" Dec 06 10:05:38 crc kubenswrapper[4672]: E1206 10:05:38.473534 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd\": container with ID starting with fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd not found: ID does not exist" containerID="fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.473638 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd"} err="failed to get container status \"fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd\": rpc error: code = NotFound desc = could not find container \"fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd\": container with ID starting with fdce0f93ba8c34365f94ba89930ec4b1fa233ca8b99e279a4562941e2dec40fd not found: ID does not exist" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.473715 4672 scope.go:117] "RemoveContainer" containerID="0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e" Dec 06 10:05:38 crc kubenswrapper[4672]: E1206 10:05:38.474007 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e\": container with ID starting with 0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e not found: ID does not exist" containerID="0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.474032 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e"} err="failed to get container status \"0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e\": rpc error: code = NotFound desc = could not find container \"0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e\": container with ID starting with 0d159b2bd7ac2cddfb719a231e36946fb72b0bda24c96df64bb17f374e37141e not found: ID does not exist" Dec 06 10:05:38 crc kubenswrapper[4672]: I1206 10:05:38.571683 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02616aec-9825-4c17-9969-eadd0405341c" path="/var/lib/kubelet/pods/02616aec-9825-4c17-9969-eadd0405341c/volumes" Dec 06 10:06:42 crc kubenswrapper[4672]: I1206 10:06:42.319127 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:06:42 crc kubenswrapper[4672]: I1206 10:06:42.319588 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:07:12 crc kubenswrapper[4672]: I1206 10:07:12.319308 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:07:12 crc kubenswrapper[4672]: I1206 10:07:12.320016 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.320146 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.320617 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.320665 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.321395 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.321439 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" gracePeriod=600 Dec 06 10:07:42 crc kubenswrapper[4672]: E1206 10:07:42.487293 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.526159 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" exitCode=0 Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.526213 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"} Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.526250 4672 scope.go:117] "RemoveContainer" containerID="b52c4a53b3f1a8646d79189c3bda17a1d38aee3f1effad6325de3a3cc88f79b3" Dec 06 10:07:42 crc kubenswrapper[4672]: I1206 10:07:42.527101 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:07:42 crc kubenswrapper[4672]: E1206 10:07:42.527474 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:07:57 crc kubenswrapper[4672]: I1206 10:07:57.556823 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:07:57 crc kubenswrapper[4672]: E1206 10:07:57.558724 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:08:10 crc 
kubenswrapper[4672]: I1206 10:08:10.558024 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:08:10 crc kubenswrapper[4672]: E1206 10:08:10.558906 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:08:25 crc kubenswrapper[4672]: I1206 10:08:25.556791 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:08:25 crc kubenswrapper[4672]: E1206 10:08:25.557633 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:08:28 crc kubenswrapper[4672]: I1206 10:08:28.549709 4672 scope.go:117] "RemoveContainer" containerID="cd54ddc56c30affd5ac1887dc1e5338b5c22b0b4612e2c1d10dec357d228833e"
Dec 06 10:08:28 crc kubenswrapper[4672]: I1206 10:08:28.586684 4672 scope.go:117] "RemoveContainer" containerID="3b28f23ad9e2c8a24d91074f7436290ef3650df1372eb9b83046034b8698b41d"
Dec 06 10:08:37 crc kubenswrapper[4672]: I1206 10:08:37.556836 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:08:37 crc kubenswrapper[4672]: E1206 10:08:37.557751 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:08:48 crc kubenswrapper[4672]: I1206 10:08:48.557514 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:08:48 crc kubenswrapper[4672]: E1206 10:08:48.558703 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:09:02 crc kubenswrapper[4672]: I1206 10:09:02.563816 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:09:02 crc kubenswrapper[4672]: E1206 10:09:02.564698 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:09:16 crc kubenswrapper[4672]: I1206 10:09:16.558353 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:09:16 crc kubenswrapper[4672]: E1206 10:09:16.559118 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:09:31 crc kubenswrapper[4672]: I1206 10:09:31.558372 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:09:31 crc kubenswrapper[4672]: E1206 10:09:31.559162 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:09:45 crc kubenswrapper[4672]: I1206 10:09:45.557069 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:09:45 crc kubenswrapper[4672]: E1206 10:09:45.557914 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:09:56 crc kubenswrapper[4672]: I1206 10:09:56.556961 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:09:56 crc kubenswrapper[4672]: E1206 10:09:56.557789 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:10:08 crc kubenswrapper[4672]: I1206 10:10:08.557053 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:10:08 crc kubenswrapper[4672]: E1206 10:10:08.557856 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh"
podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:10:20 crc kubenswrapper[4672]: I1206 10:10:20.558191 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:10:20 crc kubenswrapper[4672]: E1206 10:10:20.558938 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:10:33 crc kubenswrapper[4672]: I1206 10:10:33.559559 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:10:33 crc kubenswrapper[4672]: E1206 10:10:33.560235 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:10:44 crc kubenswrapper[4672]: I1206 10:10:44.556876 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:10:44 crc kubenswrapper[4672]: E1206 10:10:44.557842 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:10:59 crc kubenswrapper[4672]: I1206 10:10:59.557384 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:10:59 crc kubenswrapper[4672]: E1206 10:10:59.558004 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:11:01 crc kubenswrapper[4672]: I1206 10:11:01.081172 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-fvtdd"] Dec 06 10:11:01 crc kubenswrapper[4672]: I1206 10:11:01.094506 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-fvtdd"] Dec 06 10:11:02 crc kubenswrapper[4672]: I1206 10:11:02.026032 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-3062-account-create-update-7k7nc"] Dec 06 10:11:02 crc kubenswrapper[4672]: I1206 10:11:02.034467 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-3062-account-create-update-7k7nc"] Dec 06 10:11:02 crc kubenswrapper[4672]: I1206 10:11:02.573082 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="129c23da-4548-4207-808e-296c6e1f6396" path="/var/lib/kubelet/pods/129c23da-4548-4207-808e-296c6e1f6396/volumes" Dec 06 10:11:02 crc kubenswrapper[4672]: I1206 10:11:02.575189 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54fd106c-75b6-4c7a-810c-dd13d4655cba" path="/var/lib/kubelet/pods/54fd106c-75b6-4c7a-810c-dd13d4655cba/volumes" Dec 06 10:11:14 crc kubenswrapper[4672]: I1206 10:11:14.557544 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:11:14 crc kubenswrapper[4672]: E1206 10:11:14.558569 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:11:28 crc kubenswrapper[4672]: I1206 10:11:28.557485 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:11:28 crc kubenswrapper[4672]: E1206 10:11:28.558344 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:11:28 crc kubenswrapper[4672]: I1206 10:11:28.692360 4672 scope.go:117] "RemoveContainer" containerID="8c1542d192562b39daee0b6830d11cc3c88e0389589e3bab2d4a2c69002de8ee" Dec 06 10:11:28 crc kubenswrapper[4672]: I1206 10:11:28.716940 4672 scope.go:117] "RemoveContainer" containerID="bdbdec7cbeffcee022b2554351f1d7917ac0f455aad112cb3a5e8818b32b5d05" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.652007 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vvv9j"] Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.654072 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="extract-utilities" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.654201 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="extract-utilities" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.654295 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.654381 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.654482 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="extract-content" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.654558 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="extract-content" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.654662 4672 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="extract-content" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.654759 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="extract-content" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.654845 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.654914 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.654986 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="extract-content" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.655058 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="extract-content" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.655134 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="extract-utilities" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.655208 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="extract-utilities" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.655282 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="extract-utilities" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.655352 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="extract-utilities" Dec 06 10:11:33 crc kubenswrapper[4672]: E1206 10:11:33.655428 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.655503 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.655841 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="02616aec-9825-4c17-9969-eadd0405341c" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.655938 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d898c68a-5600-4634-b005-69e25e854b40" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.656032 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35debca-4e88-40f8-b458-0800427ff967" containerName="registry-server" Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.657774 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.680721 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvv9j"]
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.727026 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-utilities\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.727126 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-catalog-content\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.727262 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv6nr\" (UniqueName: \"kubernetes.io/projected/868cc727-1d6d-442d-80f4-f3294f7e50de-kube-api-access-qv6nr\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.829388 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-catalog-content\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.829547 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6nr\" (UniqueName: \"kubernetes.io/projected/868cc727-1d6d-442d-80f4-f3294f7e50de-kube-api-access-qv6nr\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.829619 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-utilities\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.829889 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-catalog-content\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.830372 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-utilities\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.862919 4672 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-qv6nr\" (UniqueName: \"kubernetes.io/projected/868cc727-1d6d-442d-80f4-f3294f7e50de-kube-api-access-qv6nr\") pod \"redhat-marketplace-vvv9j\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") " pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:33 crc kubenswrapper[4672]: I1206 10:11:33.987275 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:34 crc kubenswrapper[4672]: I1206 10:11:34.670408 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvv9j"]
Dec 06 10:11:35 crc kubenswrapper[4672]: I1206 10:11:35.039998 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-96zr2"]
Dec 06 10:11:35 crc kubenswrapper[4672]: I1206 10:11:35.050092 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-96zr2"]
Dec 06 10:11:35 crc kubenswrapper[4672]: I1206 10:11:35.138966 4672 generic.go:334] "Generic (PLEG): container finished" podID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerID="60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324" exitCode=0
Dec 06 10:11:35 crc kubenswrapper[4672]: I1206 10:11:35.139016 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerDied","Data":"60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324"}
Dec 06 10:11:35 crc kubenswrapper[4672]: I1206 10:11:35.139064 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerStarted","Data":"e8e2ea7b421613ca6589fa60c9c542c25761c48695e1b60552a21fad9691863a"}
Dec 06 10:11:35 crc kubenswrapper[4672]: I1206 10:11:35.141536 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 10:11:36 crc kubenswrapper[4672]: I1206 10:11:36.566552 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bfe87b4-aa9f-475a-bba9-438425d79d47" path="/var/lib/kubelet/pods/5bfe87b4-aa9f-475a-bba9-438425d79d47/volumes"
Dec 06 10:11:37 crc kubenswrapper[4672]: I1206 10:11:37.154856 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerStarted","Data":"2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929"}
Dec 06 10:11:38 crc kubenswrapper[4672]: I1206 10:11:38.164887 4672 generic.go:334] "Generic (PLEG): container finished" podID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerID="2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929" exitCode=0
Dec 06 10:11:38 crc kubenswrapper[4672]: I1206 10:11:38.164964 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerDied","Data":"2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929"}
Dec 06 10:11:39 crc kubenswrapper[4672]: I1206 10:11:39.185084 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerStarted","Data":"baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5"}
Dec 06 10:11:39 crc kubenswrapper[4672]: I1206
10:11:39.222168 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vvv9j" podStartSLOduration=2.805856041 podStartE2EDuration="6.222147097s" podCreationTimestamp="2025-12-06 10:11:33 +0000 UTC" firstStartedPulling="2025-12-06 10:11:35.141255381 +0000 UTC m=+3912.885515668" lastFinishedPulling="2025-12-06 10:11:38.557546437 +0000 UTC m=+3916.301806724" observedRunningTime="2025-12-06 10:11:39.211250252 +0000 UTC m=+3916.955510539" watchObservedRunningTime="2025-12-06 10:11:39.222147097 +0000 UTC m=+3916.966407384"
Dec 06 10:11:39 crc kubenswrapper[4672]: I1206 10:11:39.557202 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83"
Dec 06 10:11:39 crc kubenswrapper[4672]: E1206 10:11:39.557402 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:11:43 crc kubenswrapper[4672]: I1206 10:11:43.987970 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:43 crc kubenswrapper[4672]: I1206 10:11:43.989582 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:44 crc kubenswrapper[4672]: I1206 10:11:44.055770 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:44 crc kubenswrapper[4672]: I1206 10:11:44.287120 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:44 crc kubenswrapper[4672]: I1206 10:11:44.341731 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvv9j"]
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.250328 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vvv9j" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="registry-server" containerID="cri-o://baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5" gracePeriod=2
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.785421 4672 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.880994 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-catalog-content\") pod \"868cc727-1d6d-442d-80f4-f3294f7e50de\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") "
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.881310 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-utilities\") pod \"868cc727-1d6d-442d-80f4-f3294f7e50de\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") "
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.881438 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv6nr\" (UniqueName: \"kubernetes.io/projected/868cc727-1d6d-442d-80f4-f3294f7e50de-kube-api-access-qv6nr\") pod \"868cc727-1d6d-442d-80f4-f3294f7e50de\" (UID: \"868cc727-1d6d-442d-80f4-f3294f7e50de\") "
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.883331 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-utilities" (OuterVolumeSpecName: "utilities") pod "868cc727-1d6d-442d-80f4-f3294f7e50de" (UID: "868cc727-1d6d-442d-80f4-f3294f7e50de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.900021 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868cc727-1d6d-442d-80f4-f3294f7e50de-kube-api-access-qv6nr" (OuterVolumeSpecName: "kube-api-access-qv6nr") pod "868cc727-1d6d-442d-80f4-f3294f7e50de" (UID: "868cc727-1d6d-442d-80f4-f3294f7e50de"). InnerVolumeSpecName "kube-api-access-qv6nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.902834 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "868cc727-1d6d-442d-80f4-f3294f7e50de" (UID: "868cc727-1d6d-442d-80f4-f3294f7e50de"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.983786 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.983815 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868cc727-1d6d-442d-80f4-f3294f7e50de-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:11:46 crc kubenswrapper[4672]: I1206 10:11:46.983827 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv6nr\" (UniqueName: \"kubernetes.io/projected/868cc727-1d6d-442d-80f4-f3294f7e50de-kube-api-access-qv6nr\") on node \"crc\" DevicePath \"\"" Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.266976 4672 generic.go:334] "Generic (PLEG): container finished" podID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerID="baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5" exitCode=0 Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.267027 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerDied","Data":"baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5"} Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.267057 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvv9j" event={"ID":"868cc727-1d6d-442d-80f4-f3294f7e50de","Type":"ContainerDied","Data":"e8e2ea7b421613ca6589fa60c9c542c25761c48695e1b60552a21fad9691863a"} Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.267079 4672 scope.go:117] "RemoveContainer" containerID="baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5" Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.267240 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvv9j"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.306963 4672 scope.go:117] "RemoveContainer" containerID="2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.316130 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvv9j"]
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.366941 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvv9j"]
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.379734 4672 scope.go:117] "RemoveContainer" containerID="60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.405038 4672 scope.go:117] "RemoveContainer" containerID="baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5"
Dec 06 10:11:47 crc kubenswrapper[4672]: E1206 10:11:47.405452 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5\": container with ID starting with baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5 not found: ID does not exist" containerID="baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.405551 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5"} err="failed to get container status \"baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5\": rpc error: code = NotFound desc = could not find container \"baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5\": container with ID starting with baae375b08a18aeaa1131ebf4e8473446c84cdcb4029da133643008cbdc7f1e5 not found: ID does not exist"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.405646 4672 scope.go:117] "RemoveContainer" containerID="2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929"
Dec 06 10:11:47 crc kubenswrapper[4672]: E1206 10:11:47.405998 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929\": container with ID starting with 2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929 not found: ID does not exist" containerID="2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.406086 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929"} err="failed to get container status \"2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929\": rpc error: code = NotFound desc = could not find container \"2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929\": container with ID starting with 2041ac89c3ca8a3974f046a61570b1782cb30bd9ba856fcd8f0fde2c1b8c4929 not found: ID does not exist"
Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.406148 4672 scope.go:117] "RemoveContainer" containerID="60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324"
Dec 06 10:11:47 crc kubenswrapper[4672]: E1206 10:11:47.407493 4672 log.go:32] "ContainerStatus from runtime service
failed" err="rpc error: code = NotFound desc = could not find container \"60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324\": container with ID starting with 60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324 not found: ID does not exist" containerID="60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324" Dec 06 10:11:47 crc kubenswrapper[4672]: I1206 10:11:47.407612 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324"} err="failed to get container status \"60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324\": rpc error: code = NotFound desc = could not find container \"60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324\": container with ID starting with 60e62a271ce0fba92099e2ec5fb32638efb2c9a42e9607a7c2f0288a61b7c324 not found: ID does not exist" Dec 06 10:11:48 crc kubenswrapper[4672]: I1206 10:11:48.573090 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" path="/var/lib/kubelet/pods/868cc727-1d6d-442d-80f4-f3294f7e50de/volumes" Dec 06 10:11:54 crc kubenswrapper[4672]: I1206 10:11:54.557209 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:11:54 crc kubenswrapper[4672]: E1206 10:11:54.557847 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:12:05 crc kubenswrapper[4672]: I1206 10:12:05.556693 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:12:05 crc kubenswrapper[4672]: E1206 10:12:05.557313 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:12:16 crc kubenswrapper[4672]: I1206 10:12:16.556788 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:12:16 crc kubenswrapper[4672]: E1206 10:12:16.557488 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:12:28 crc kubenswrapper[4672]: I1206 10:12:28.862102 4672 scope.go:117] "RemoveContainer" containerID="146877679d3fbb7046b748f1038bf730ca69f61a25f128667cf8a4bb0cdc7c11" Dec 06 10:12:29 crc kubenswrapper[4672]: I1206 10:12:29.557073 4672 scope.go:117] "RemoveContainer" 
containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:12:29 crc kubenswrapper[4672]: E1206 10:12:29.557512 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:12:44 crc kubenswrapper[4672]: I1206 10:12:44.559416 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:12:45 crc kubenswrapper[4672]: I1206 10:12:45.783232 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"a9fc2abd84860456c000814e8f44da296b7a156fa72f23c9ec5a13b3d45f8bff"} Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.197625 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"] Dec 06 10:15:00 crc kubenswrapper[4672]: E1206 10:15:00.198533 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="extract-content" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.198546 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="extract-content" Dec 06 10:15:00 crc kubenswrapper[4672]: E1206 10:15:00.198590 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="extract-utilities" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.198611 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="extract-utilities" Dec 06 10:15:00 crc kubenswrapper[4672]: E1206 10:15:00.198618 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="registry-server" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.198624 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="registry-server" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.198828 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="868cc727-1d6d-442d-80f4-f3294f7e50de" containerName="registry-server" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.199503 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.202717 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.203165 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.208063 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"]
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.305778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91b24b50-c34b-4010-bab5-7a5643641264-config-volume\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.305830 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7fwf\" (UniqueName: \"kubernetes.io/projected/91b24b50-c34b-4010-bab5-7a5643641264-kube-api-access-g7fwf\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.305922 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91b24b50-c34b-4010-bab5-7a5643641264-secret-volume\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.409866 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91b24b50-c34b-4010-bab5-7a5643641264-config-volume\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.410210 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7fwf\" (UniqueName: \"kubernetes.io/projected/91b24b50-c34b-4010-bab5-7a5643641264-kube-api-access-g7fwf\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.410415 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91b24b50-c34b-4010-bab5-7a5643641264-secret-volume\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.411965 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91b24b50-c34b-4010-bab5-7a5643641264-config-volume\") pod
\"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.438256 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91b24b50-c34b-4010-bab5-7a5643641264-secret-volume\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.450442 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7fwf\" (UniqueName: \"kubernetes.io/projected/91b24b50-c34b-4010-bab5-7a5643641264-kube-api-access-g7fwf\") pod \"collect-profiles-29416935-kwlds\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" Dec 06 10:15:00 crc kubenswrapper[4672]: I1206 10:15:00.542573 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" Dec 06 10:15:01 crc kubenswrapper[4672]: I1206 10:15:01.076703 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"] Dec 06 10:15:01 crc kubenswrapper[4672]: I1206 10:15:01.133139 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" event={"ID":"91b24b50-c34b-4010-bab5-7a5643641264","Type":"ContainerStarted","Data":"8739b085a1fe8ba0e30d920bff7e7a8e7aead373d6d460cef4e51a4e814d05d4"} Dec 06 10:15:02 crc kubenswrapper[4672]: I1206 10:15:02.143522 4672 generic.go:334] "Generic (PLEG): container finished" podID="91b24b50-c34b-4010-bab5-7a5643641264" containerID="550ae923bf35c63edcf989aa12452ef3186510a98afbc3fa3c6b08650b9dd5ac" exitCode=0 Dec 06 10:15:02 crc kubenswrapper[4672]: I1206 10:15:02.144040 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" event={"ID":"91b24b50-c34b-4010-bab5-7a5643641264","Type":"ContainerDied","Data":"550ae923bf35c63edcf989aa12452ef3186510a98afbc3fa3c6b08650b9dd5ac"} Dec 06 10:15:03 crc kubenswrapper[4672]: I1206 10:15:03.885063 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds"
Dec 06 10:15:03 crc kubenswrapper[4672]: I1206 10:15:03.979856 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91b24b50-c34b-4010-bab5-7a5643641264-config-volume\") pod \"91b24b50-c34b-4010-bab5-7a5643641264\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") "
Dec 06 10:15:03 crc kubenswrapper[4672]: I1206 10:15:03.979917 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7fwf\" (UniqueName: \"kubernetes.io/projected/91b24b50-c34b-4010-bab5-7a5643641264-kube-api-access-g7fwf\") pod \"91b24b50-c34b-4010-bab5-7a5643641264\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") "
Dec 06 10:15:03 crc kubenswrapper[4672]: I1206 10:15:03.979948 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91b24b50-c34b-4010-bab5-7a5643641264-secret-volume\") pod \"91b24b50-c34b-4010-bab5-7a5643641264\" (UID: \"91b24b50-c34b-4010-bab5-7a5643641264\") "
Dec 06 10:15:03 crc kubenswrapper[4672]: I1206 10:15:03.980725 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91b24b50-c34b-4010-bab5-7a5643641264-config-volume" (OuterVolumeSpecName: "config-volume") pod "91b24b50-c34b-4010-bab5-7a5643641264" (UID: "91b24b50-c34b-4010-bab5-7a5643641264"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.002804 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b24b50-c34b-4010-bab5-7a5643641264-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "91b24b50-c34b-4010-bab5-7a5643641264" (UID: "91b24b50-c34b-4010-bab5-7a5643641264"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.003828 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b24b50-c34b-4010-bab5-7a5643641264-kube-api-access-g7fwf" (OuterVolumeSpecName: "kube-api-access-g7fwf") pod "91b24b50-c34b-4010-bab5-7a5643641264" (UID: "91b24b50-c34b-4010-bab5-7a5643641264"). InnerVolumeSpecName "kube-api-access-g7fwf".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.082676 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91b24b50-c34b-4010-bab5-7a5643641264-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.082712 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7fwf\" (UniqueName: \"kubernetes.io/projected/91b24b50-c34b-4010-bab5-7a5643641264-kube-api-access-g7fwf\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.082730 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91b24b50-c34b-4010-bab5-7a5643641264-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.176798 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" event={"ID":"91b24b50-c34b-4010-bab5-7a5643641264","Type":"ContainerDied","Data":"8739b085a1fe8ba0e30d920bff7e7a8e7aead373d6d460cef4e51a4e814d05d4"} Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.176864 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-kwlds" Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.176833 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8739b085a1fe8ba0e30d920bff7e7a8e7aead373d6d460cef4e51a4e814d05d4" Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.975926 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q"] Dec 06 10:15:04 crc kubenswrapper[4672]: I1206 10:15:04.987483 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-2hz5q"] Dec 06 10:15:06 crc kubenswrapper[4672]: I1206 10:15:06.573342 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2804b63a-981b-41bd-bedb-370f4d1a4820" path="/var/lib/kubelet/pods/2804b63a-981b-41bd-bedb-370f4d1a4820/volumes" Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.584868 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tf4k7"] Dec 06 10:15:07 crc kubenswrapper[4672]: E1206 10:15:07.585659 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b24b50-c34b-4010-bab5-7a5643641264" containerName="collect-profiles" Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.585675 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b24b50-c34b-4010-bab5-7a5643641264" containerName="collect-profiles" Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.585923 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b24b50-c34b-4010-bab5-7a5643641264" containerName="collect-profiles" Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.588202 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.608807 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tf4k7"]
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.661387 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rp66\" (UniqueName: \"kubernetes.io/projected/0b8efc57-f746-4182-82de-0282f336b1a1-kube-api-access-8rp66\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.661578 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-catalog-content\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.661649 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-utilities\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.764443 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rp66\" (UniqueName: \"kubernetes.io/projected/0b8efc57-f746-4182-82de-0282f336b1a1-kube-api-access-8rp66\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.764729 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-catalog-content\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.764819 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-utilities\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.765288 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-catalog-content\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.765374 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-utilities\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7"
Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.787437 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-8rp66\" (UniqueName: \"kubernetes.io/projected/0b8efc57-f746-4182-82de-0282f336b1a1-kube-api-access-8rp66\") pod \"redhat-operators-tf4k7\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:07 crc kubenswrapper[4672]: I1206 10:15:07.947579 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:08 crc kubenswrapper[4672]: I1206 10:15:08.447725 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tf4k7"] Dec 06 10:15:09 crc kubenswrapper[4672]: I1206 10:15:09.221205 4672 generic.go:334] "Generic (PLEG): container finished" podID="0b8efc57-f746-4182-82de-0282f336b1a1" containerID="5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006" exitCode=0 Dec 06 10:15:09 crc kubenswrapper[4672]: I1206 10:15:09.221291 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerDied","Data":"5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006"} Dec 06 10:15:09 crc kubenswrapper[4672]: I1206 10:15:09.222905 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerStarted","Data":"3b432fb4d56d69249a81c1e4cd33c9b6c2a580815b83c3d48ab59908480784ce"} Dec 06 10:15:10 crc kubenswrapper[4672]: I1206 10:15:10.236255 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerStarted","Data":"a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40"} Dec 06 10:15:12 crc kubenswrapper[4672]: I1206 10:15:12.319231 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:15:12 crc kubenswrapper[4672]: I1206 10:15:12.319548 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:15:13 crc kubenswrapper[4672]: I1206 10:15:13.261554 4672 generic.go:334] "Generic (PLEG): container finished" podID="0b8efc57-f746-4182-82de-0282f336b1a1" containerID="a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40" exitCode=0 Dec 06 10:15:13 crc kubenswrapper[4672]: I1206 10:15:13.261631 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerDied","Data":"a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40"} Dec 06 10:15:14 crc kubenswrapper[4672]: I1206 10:15:14.276210 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerStarted","Data":"44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f"} Dec 06 10:15:14 crc kubenswrapper[4672]: I1206 10:15:14.308895 4672 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tf4k7" podStartSLOduration=2.849543288 podStartE2EDuration="7.308872405s" podCreationTimestamp="2025-12-06 10:15:07 +0000 UTC" firstStartedPulling="2025-12-06 10:15:09.224786943 +0000 UTC m=+4126.969047230" lastFinishedPulling="2025-12-06 10:15:13.68411606 +0000 UTC m=+4131.428376347" observedRunningTime="2025-12-06 10:15:14.300864078 +0000 UTC m=+4132.045124365" watchObservedRunningTime="2025-12-06 10:15:14.308872405 +0000 UTC m=+4132.053132692" Dec 06 10:15:17 crc kubenswrapper[4672]: I1206 10:15:17.948578 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:17 crc kubenswrapper[4672]: I1206 10:15:17.949172 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:18 crc kubenswrapper[4672]: I1206 10:15:18.994534 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tf4k7" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="registry-server" probeResult="failure" output=< Dec 06 10:15:18 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 10:15:18 crc kubenswrapper[4672]: > Dec 06 10:15:28 crc kubenswrapper[4672]: I1206 10:15:28.001239 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:28 crc kubenswrapper[4672]: I1206 10:15:28.070177 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:28 crc kubenswrapper[4672]: I1206 10:15:28.250058 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tf4k7"] Dec 06 10:15:29 crc kubenswrapper[4672]: I1206 10:15:29.004645 4672 scope.go:117] "RemoveContainer" containerID="5f8d1e342b2907d2cfc6d9974f8e7517f45ea6edd8ef201e2b8cdb4d95c8b7bb" Dec 06 10:15:29 crc kubenswrapper[4672]: I1206 10:15:29.414676 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tf4k7" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="registry-server" containerID="cri-o://44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f" gracePeriod=2 Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.021506 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.091540 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-utilities\") pod \"0b8efc57-f746-4182-82de-0282f336b1a1\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.091675 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rp66\" (UniqueName: \"kubernetes.io/projected/0b8efc57-f746-4182-82de-0282f336b1a1-kube-api-access-8rp66\") pod \"0b8efc57-f746-4182-82de-0282f336b1a1\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.091999 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-catalog-content\") pod \"0b8efc57-f746-4182-82de-0282f336b1a1\" (UID: \"0b8efc57-f746-4182-82de-0282f336b1a1\") " Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.093485 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-utilities" (OuterVolumeSpecName: "utilities") pod "0b8efc57-f746-4182-82de-0282f336b1a1" (UID: "0b8efc57-f746-4182-82de-0282f336b1a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.110969 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8efc57-f746-4182-82de-0282f336b1a1-kube-api-access-8rp66" (OuterVolumeSpecName: "kube-api-access-8rp66") pod "0b8efc57-f746-4182-82de-0282f336b1a1" (UID: "0b8efc57-f746-4182-82de-0282f336b1a1"). InnerVolumeSpecName "kube-api-access-8rp66". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.194761 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.194794 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rp66\" (UniqueName: \"kubernetes.io/projected/0b8efc57-f746-4182-82de-0282f336b1a1-kube-api-access-8rp66\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.218025 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b8efc57-f746-4182-82de-0282f336b1a1" (UID: "0b8efc57-f746-4182-82de-0282f336b1a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.296532 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8efc57-f746-4182-82de-0282f336b1a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.427364 4672 generic.go:334] "Generic (PLEG): container finished" podID="0b8efc57-f746-4182-82de-0282f336b1a1" containerID="44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f" exitCode=0 Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.427429 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tf4k7" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.427460 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerDied","Data":"44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f"} Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.427903 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tf4k7" event={"ID":"0b8efc57-f746-4182-82de-0282f336b1a1","Type":"ContainerDied","Data":"3b432fb4d56d69249a81c1e4cd33c9b6c2a580815b83c3d48ab59908480784ce"} Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.427925 4672 scope.go:117] "RemoveContainer" containerID="44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.450416 4672 scope.go:117] "RemoveContainer" containerID="a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.470476 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tf4k7"] Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.481014 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tf4k7"] Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.489217 4672 scope.go:117] "RemoveContainer" containerID="5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.540053 4672 scope.go:117] "RemoveContainer" containerID="44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f" Dec 06 10:15:30 crc kubenswrapper[4672]: E1206 10:15:30.540524 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f\": container with ID starting with 44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f not found: ID does not exist" containerID="44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.540559 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f"} err="failed to get container status \"44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f\": rpc error: code = NotFound desc = could not find container \"44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f\": container with ID starting with 44241806c3ec1ee6fc4a2855791e215a2a4f04939b941cdbe607450bf2bea84f not found: ID does not exist" Dec 06 10:15:30 crc 
kubenswrapper[4672]: I1206 10:15:30.540579 4672 scope.go:117] "RemoveContainer" containerID="a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40" Dec 06 10:15:30 crc kubenswrapper[4672]: E1206 10:15:30.540830 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40\": container with ID starting with a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40 not found: ID does not exist" containerID="a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.540858 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40"} err="failed to get container status \"a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40\": rpc error: code = NotFound desc = could not find container \"a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40\": container with ID starting with a6c998a40c455503e1f183772055b5f56f899b432bbe9c5d12c3ed8cc0f6cb40 not found: ID does not exist" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.540873 4672 scope.go:117] "RemoveContainer" containerID="5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006" Dec 06 10:15:30 crc kubenswrapper[4672]: E1206 10:15:30.541325 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006\": container with ID starting with 5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006 not found: ID does not exist" containerID="5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.541347 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006"} err="failed to get container status \"5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006\": rpc error: code = NotFound desc = could not find container \"5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006\": container with ID starting with 5924f195104133a7f9de96ebfb003e317dcea896684b95a4a479811cec3bf006 not found: ID does not exist" Dec 06 10:15:30 crc kubenswrapper[4672]: I1206 10:15:30.570965 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" path="/var/lib/kubelet/pods/0b8efc57-f746-4182-82de-0282f336b1a1/volumes" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.508921 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l6tnr"] Dec 06 10:15:40 crc kubenswrapper[4672]: E1206 10:15:40.509701 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="extract-content" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.509713 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="extract-content" Dec 06 10:15:40 crc kubenswrapper[4672]: E1206 10:15:40.509748 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="registry-server" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.509755 4672 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="registry-server" Dec 06 10:15:40 crc kubenswrapper[4672]: E1206 10:15:40.509765 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="extract-utilities" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.509771 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="extract-utilities" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.509952 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8efc57-f746-4182-82de-0282f336b1a1" containerName="registry-server" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.511288 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.534661 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6tnr"] Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.644944 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-utilities\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.645191 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9c4\" (UniqueName: \"kubernetes.io/projected/f1886d0f-48ef-405c-a21a-a06c51a4f549-kube-api-access-tm9c4\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.645369 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-catalog-content\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.747942 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-utilities\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.748028 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9c4\" (UniqueName: \"kubernetes.io/projected/f1886d0f-48ef-405c-a21a-a06c51a4f549-kube-api-access-tm9c4\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.748068 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-catalog-content\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc 
kubenswrapper[4672]: I1206 10:15:40.748581 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-catalog-content\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.748884 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-utilities\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.774647 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9c4\" (UniqueName: \"kubernetes.io/projected/f1886d0f-48ef-405c-a21a-a06c51a4f549-kube-api-access-tm9c4\") pod \"community-operators-l6tnr\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:40 crc kubenswrapper[4672]: I1206 10:15:40.833177 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:41 crc kubenswrapper[4672]: I1206 10:15:41.499748 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6tnr"] Dec 06 10:15:41 crc kubenswrapper[4672]: I1206 10:15:41.519004 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6tnr" event={"ID":"f1886d0f-48ef-405c-a21a-a06c51a4f549","Type":"ContainerStarted","Data":"60a96303adf92c59ae1661fefe2aae5a148e1d3f2d5797f8348145dfa7cc33fd"} Dec 06 10:15:42 crc kubenswrapper[4672]: I1206 10:15:42.320293 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:15:42 crc kubenswrapper[4672]: I1206 10:15:42.320695 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:15:42 crc kubenswrapper[4672]: I1206 10:15:42.528513 4672 generic.go:334] "Generic (PLEG): container finished" podID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerID="73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac" exitCode=0 Dec 06 10:15:42 crc kubenswrapper[4672]: I1206 10:15:42.529422 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6tnr" event={"ID":"f1886d0f-48ef-405c-a21a-a06c51a4f549","Type":"ContainerDied","Data":"73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac"} Dec 06 10:15:44 crc kubenswrapper[4672]: I1206 10:15:44.546071 4672 generic.go:334] "Generic (PLEG): container finished" podID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerID="b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec" exitCode=0 Dec 06 10:15:44 crc kubenswrapper[4672]: I1206 10:15:44.546131 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-l6tnr" event={"ID":"f1886d0f-48ef-405c-a21a-a06c51a4f549","Type":"ContainerDied","Data":"b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec"} Dec 06 10:15:46 crc kubenswrapper[4672]: I1206 10:15:46.567459 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6tnr" event={"ID":"f1886d0f-48ef-405c-a21a-a06c51a4f549","Type":"ContainerStarted","Data":"4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2"} Dec 06 10:15:46 crc kubenswrapper[4672]: I1206 10:15:46.596095 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l6tnr" podStartSLOduration=4.210326358 podStartE2EDuration="6.596077237s" podCreationTimestamp="2025-12-06 10:15:40 +0000 UTC" firstStartedPulling="2025-12-06 10:15:42.530744022 +0000 UTC m=+4160.275004309" lastFinishedPulling="2025-12-06 10:15:44.916494911 +0000 UTC m=+4162.660755188" observedRunningTime="2025-12-06 10:15:46.589488699 +0000 UTC m=+4164.333748996" watchObservedRunningTime="2025-12-06 10:15:46.596077237 +0000 UTC m=+4164.340337524" Dec 06 10:15:50 crc kubenswrapper[4672]: I1206 10:15:50.834405 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:50 crc kubenswrapper[4672]: I1206 10:15:50.836545 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:50 crc kubenswrapper[4672]: I1206 10:15:50.887263 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:51 crc kubenswrapper[4672]: I1206 10:15:51.681515 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:51 crc kubenswrapper[4672]: I1206 10:15:51.749538 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l6tnr"] Dec 06 10:15:53 crc kubenswrapper[4672]: I1206 10:15:53.627387 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l6tnr" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="registry-server" containerID="cri-o://4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2" gracePeriod=2 Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.115031 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.145512 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9c4\" (UniqueName: \"kubernetes.io/projected/f1886d0f-48ef-405c-a21a-a06c51a4f549-kube-api-access-tm9c4\") pod \"f1886d0f-48ef-405c-a21a-a06c51a4f549\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.147886 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-utilities\") pod \"f1886d0f-48ef-405c-a21a-a06c51a4f549\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.148064 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-catalog-content\") pod \"f1886d0f-48ef-405c-a21a-a06c51a4f549\" (UID: \"f1886d0f-48ef-405c-a21a-a06c51a4f549\") " Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.174320 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-utilities" (OuterVolumeSpecName: "utilities") pod "f1886d0f-48ef-405c-a21a-a06c51a4f549" (UID: "f1886d0f-48ef-405c-a21a-a06c51a4f549"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.213797 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1886d0f-48ef-405c-a21a-a06c51a4f549-kube-api-access-tm9c4" (OuterVolumeSpecName: "kube-api-access-tm9c4") pod "f1886d0f-48ef-405c-a21a-a06c51a4f549" (UID: "f1886d0f-48ef-405c-a21a-a06c51a4f549"). InnerVolumeSpecName "kube-api-access-tm9c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.319885 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9c4\" (UniqueName: \"kubernetes.io/projected/f1886d0f-48ef-405c-a21a-a06c51a4f549-kube-api-access-tm9c4\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.319928 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.347079 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1886d0f-48ef-405c-a21a-a06c51a4f549" (UID: "f1886d0f-48ef-405c-a21a-a06c51a4f549"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.422052 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1886d0f-48ef-405c-a21a-a06c51a4f549-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.639797 4672 generic.go:334] "Generic (PLEG): container finished" podID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerID="4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2" exitCode=0 Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.639878 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6tnr" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.639901 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6tnr" event={"ID":"f1886d0f-48ef-405c-a21a-a06c51a4f549","Type":"ContainerDied","Data":"4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2"} Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.641627 4672 scope.go:117] "RemoveContainer" containerID="4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.641802 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6tnr" event={"ID":"f1886d0f-48ef-405c-a21a-a06c51a4f549","Type":"ContainerDied","Data":"60a96303adf92c59ae1661fefe2aae5a148e1d3f2d5797f8348145dfa7cc33fd"} Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.679592 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l6tnr"] Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.679745 4672 scope.go:117] "RemoveContainer" containerID="b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec" Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.688059 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l6tnr"] Dec 06 10:15:54 crc kubenswrapper[4672]: I1206 10:15:54.982466 4672 scope.go:117] "RemoveContainer" containerID="73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac" Dec 06 10:15:55 crc kubenswrapper[4672]: I1206 10:15:55.039906 4672 scope.go:117] "RemoveContainer" containerID="4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2" Dec 06 10:15:55 crc kubenswrapper[4672]: E1206 10:15:55.040395 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2\": container with ID starting with 4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2 not found: ID does not exist" containerID="4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2" Dec 06 10:15:55 crc kubenswrapper[4672]: I1206 10:15:55.040447 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2"} err="failed to get container status \"4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2\": rpc error: code = NotFound desc = could not find container \"4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2\": container with ID starting with 4ae8f1528b128e56a619d2db26991d575f0574a5a09018633e248f9d6717c0f2 not found: ID does not exist" Dec 06 
10:15:55 crc kubenswrapper[4672]: I1206 10:15:55.040481 4672 scope.go:117] "RemoveContainer" containerID="b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec" Dec 06 10:15:55 crc kubenswrapper[4672]: E1206 10:15:55.040848 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec\": container with ID starting with b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec not found: ID does not exist" containerID="b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec" Dec 06 10:15:55 crc kubenswrapper[4672]: I1206 10:15:55.040875 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec"} err="failed to get container status \"b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec\": rpc error: code = NotFound desc = could not find container \"b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec\": container with ID starting with b4b76f8e343fba3a77dd9091ff3bf7dabcec33d485baf2ffff4649e7efe636ec not found: ID does not exist" Dec 06 10:15:55 crc kubenswrapper[4672]: I1206 10:15:55.040896 4672 scope.go:117] "RemoveContainer" containerID="73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac" Dec 06 10:15:55 crc kubenswrapper[4672]: E1206 10:15:55.041268 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac\": container with ID starting with 73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac not found: ID does not exist" containerID="73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac" Dec 06 10:15:55 crc kubenswrapper[4672]: I1206 10:15:55.041298 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac"} err="failed to get container status \"73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac\": rpc error: code = NotFound desc = could not find container \"73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac\": container with ID starting with 73075c8208d3c4aca57ae45aca219ab831762d276fb4a78c24ca101d877eabac not found: ID does not exist" Dec 06 10:15:56 crc kubenswrapper[4672]: I1206 10:15:56.569015 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" path="/var/lib/kubelet/pods/f1886d0f-48ef-405c-a21a-a06c51a4f549/volumes" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.111526 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cwkdm"] Dec 06 10:16:07 crc kubenswrapper[4672]: E1206 10:16:07.112836 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="extract-utilities" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.112860 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="extract-utilities" Dec 06 10:16:07 crc kubenswrapper[4672]: E1206 10:16:07.112914 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="registry-server" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 
10:16:07.112926 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="registry-server" Dec 06 10:16:07 crc kubenswrapper[4672]: E1206 10:16:07.112940 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="extract-content" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.112952 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="extract-content" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.113305 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1886d0f-48ef-405c-a21a-a06c51a4f549" containerName="registry-server" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.116725 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.129308 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cwkdm"] Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.297703 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ntc4\" (UniqueName: \"kubernetes.io/projected/6a17a689-a226-4e9b-83e0-4f812e0b12b3-kube-api-access-2ntc4\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.297751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-catalog-content\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.297924 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-utilities\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.399960 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-utilities\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.400131 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ntc4\" (UniqueName: \"kubernetes.io/projected/6a17a689-a226-4e9b-83e0-4f812e0b12b3-kube-api-access-2ntc4\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.400158 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-catalog-content\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 
06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.400501 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-utilities\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.400568 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-catalog-content\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.420208 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ntc4\" (UniqueName: \"kubernetes.io/projected/6a17a689-a226-4e9b-83e0-4f812e0b12b3-kube-api-access-2ntc4\") pod \"certified-operators-cwkdm\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.469249 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:07 crc kubenswrapper[4672]: I1206 10:16:07.933422 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cwkdm"] Dec 06 10:16:08 crc kubenswrapper[4672]: I1206 10:16:08.787467 4672 generic.go:334] "Generic (PLEG): container finished" podID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerID="8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050" exitCode=0 Dec 06 10:16:08 crc kubenswrapper[4672]: I1206 10:16:08.787545 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerDied","Data":"8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050"} Dec 06 10:16:08 crc kubenswrapper[4672]: I1206 10:16:08.787782 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerStarted","Data":"3a9120ae7f06957d68a286caeabf2f8f0ec7e3a41af0978dd303e5cb0977fabe"} Dec 06 10:16:09 crc kubenswrapper[4672]: I1206 10:16:09.796689 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerStarted","Data":"d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67"} Dec 06 10:16:10 crc kubenswrapper[4672]: I1206 10:16:10.810571 4672 generic.go:334] "Generic (PLEG): container finished" podID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerID="d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67" exitCode=0 Dec 06 10:16:10 crc kubenswrapper[4672]: I1206 10:16:10.810764 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerDied","Data":"d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67"} Dec 06 10:16:11 crc kubenswrapper[4672]: I1206 10:16:11.820801 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" 
event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerStarted","Data":"47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a"} Dec 06 10:16:11 crc kubenswrapper[4672]: I1206 10:16:11.851393 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cwkdm" podStartSLOduration=2.060674019 podStartE2EDuration="4.851369758s" podCreationTimestamp="2025-12-06 10:16:07 +0000 UTC" firstStartedPulling="2025-12-06 10:16:08.789377133 +0000 UTC m=+4186.533637420" lastFinishedPulling="2025-12-06 10:16:11.580072872 +0000 UTC m=+4189.324333159" observedRunningTime="2025-12-06 10:16:11.843338601 +0000 UTC m=+4189.587598898" watchObservedRunningTime="2025-12-06 10:16:11.851369758 +0000 UTC m=+4189.595630055" Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.319384 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.319957 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.320108 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.320885 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9fc2abd84860456c000814e8f44da296b7a156fa72f23c9ec5a13b3d45f8bff"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.321049 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://a9fc2abd84860456c000814e8f44da296b7a156fa72f23c9ec5a13b3d45f8bff" gracePeriod=600 Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.833203 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="a9fc2abd84860456c000814e8f44da296b7a156fa72f23c9ec5a13b3d45f8bff" exitCode=0 Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.833293 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"a9fc2abd84860456c000814e8f44da296b7a156fa72f23c9ec5a13b3d45f8bff"} Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 10:16:12.833535 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"} Dec 06 10:16:12 crc kubenswrapper[4672]: I1206 
10:16:12.833578 4672 scope.go:117] "RemoveContainer" containerID="ca4e55181ab085a6d2d94a78978707a35a3af3000a5af2216eb580aa96202e83" Dec 06 10:16:17 crc kubenswrapper[4672]: I1206 10:16:17.469636 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:17 crc kubenswrapper[4672]: I1206 10:16:17.470227 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:17 crc kubenswrapper[4672]: I1206 10:16:17.521408 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:17 crc kubenswrapper[4672]: I1206 10:16:17.982460 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:18 crc kubenswrapper[4672]: I1206 10:16:18.074583 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cwkdm"] Dec 06 10:16:18 crc kubenswrapper[4672]: I1206 10:16:18.886412 4672 generic.go:334] "Generic (PLEG): container finished" podID="5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" containerID="f5b3d8749f523904d43ba52bdbfcbc08406e845e238f2f9d8ea548d9e56c9f73" exitCode=0 Dec 06 10:16:18 crc kubenswrapper[4672]: I1206 10:16:18.886553 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d","Type":"ContainerDied","Data":"f5b3d8749f523904d43ba52bdbfcbc08406e845e238f2f9d8ea548d9e56c9f73"} Dec 06 10:16:19 crc kubenswrapper[4672]: I1206 10:16:19.896540 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cwkdm" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="registry-server" containerID="cri-o://47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a" gracePeriod=2 Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.475709 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.479459 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.662090 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s57mp\" (UniqueName: \"kubernetes.io/projected/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-kube-api-access-s57mp\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.662325 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.662431 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-config-data\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.662581 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ca-certs\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.662683 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config-secret\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.662982 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-utilities\") pod \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.663086 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-temporary\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.663242 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-catalog-content\") pod \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.664247 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ssh-key\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.664314 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-workdir\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.664350 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\" (UID: \"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.664450 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ntc4\" (UniqueName: \"kubernetes.io/projected/6a17a689-a226-4e9b-83e0-4f812e0b12b3-kube-api-access-2ntc4\") pod \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\" (UID: \"6a17a689-a226-4e9b-83e0-4f812e0b12b3\") " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.663986 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-utilities" (OuterVolumeSpecName: "utilities") pod "6a17a689-a226-4e9b-83e0-4f812e0b12b3" (UID: "6a17a689-a226-4e9b-83e0-4f812e0b12b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.664809 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-config-data" (OuterVolumeSpecName: "config-data") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.666466 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.670235 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a17a689-a226-4e9b-83e0-4f812e0b12b3-kube-api-access-2ntc4" (OuterVolumeSpecName: "kube-api-access-2ntc4") pod "6a17a689-a226-4e9b-83e0-4f812e0b12b3" (UID: "6a17a689-a226-4e9b-83e0-4f812e0b12b3"). InnerVolumeSpecName "kube-api-access-2ntc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.670313 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.680161 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-kube-api-access-s57mp" (OuterVolumeSpecName: "kube-api-access-s57mp") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "kube-api-access-s57mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.707073 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.711906 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.714759 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.731045 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.759550 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a17a689-a226-4e9b-83e0-4f812e0b12b3" (UID: "6a17a689-a226-4e9b-83e0-4f812e0b12b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766192 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ntc4\" (UniqueName: \"kubernetes.io/projected/6a17a689-a226-4e9b-83e0-4f812e0b12b3-kube-api-access-2ntc4\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766330 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s57mp\" (UniqueName: \"kubernetes.io/projected/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-kube-api-access-s57mp\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766404 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766476 4672 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766579 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766669 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766743 4672 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766830 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a17a689-a226-4e9b-83e0-4f812e0b12b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766906 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.766971 4672 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.767464 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.789402 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" (UID: "5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.795825 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.869905 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.869958 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.907932 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.908066 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d","Type":"ContainerDied","Data":"7ecacedd532b69bacd6aca045a8634d2bcbfc14689d5df865d085e5cfd5f46b9"} Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.908163 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ecacedd532b69bacd6aca045a8634d2bcbfc14689d5df865d085e5cfd5f46b9" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.910625 4672 generic.go:334] "Generic (PLEG): container finished" podID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerID="47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a" exitCode=0 Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.910672 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerDied","Data":"47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a"} Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.910701 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwkdm" event={"ID":"6a17a689-a226-4e9b-83e0-4f812e0b12b3","Type":"ContainerDied","Data":"3a9120ae7f06957d68a286caeabf2f8f0ec7e3a41af0978dd303e5cb0977fabe"} Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.910719 4672 scope.go:117] "RemoveContainer" containerID="47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.910841 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cwkdm" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.947851 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cwkdm"] Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.949860 4672 scope.go:117] "RemoveContainer" containerID="d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67" Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.959662 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cwkdm"] Dec 06 10:16:20 crc kubenswrapper[4672]: I1206 10:16:20.981740 4672 scope.go:117] "RemoveContainer" containerID="8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050" Dec 06 10:16:21 crc kubenswrapper[4672]: I1206 10:16:21.005100 4672 scope.go:117] "RemoveContainer" containerID="47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a" Dec 06 10:16:21 crc kubenswrapper[4672]: E1206 10:16:21.005632 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a\": container with ID starting with 47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a not found: ID does not exist" containerID="47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a" Dec 06 10:16:21 crc kubenswrapper[4672]: I1206 10:16:21.005695 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a"} err="failed to get container status \"47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a\": rpc error: code = NotFound desc = could not find container \"47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a\": container with ID starting with 47aa5b477a96fb0f3a137e2a09e37b322de2731d81eea94702903a868753206a not found: ID does not exist" Dec 06 10:16:21 crc kubenswrapper[4672]: I1206 10:16:21.005729 4672 scope.go:117] "RemoveContainer" containerID="d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67" Dec 06 10:16:21 crc kubenswrapper[4672]: E1206 10:16:21.006203 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67\": container with ID starting with d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67 not found: ID does not exist" containerID="d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67" Dec 06 10:16:21 crc kubenswrapper[4672]: I1206 10:16:21.006261 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67"} err="failed to get container status \"d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67\": rpc error: code = NotFound desc = could not find container \"d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67\": container with ID starting with d0151b0d8b2245033e979daecf914d9841d5cfbcee27e8617c2ea8b6b1958a67 not found: ID does not exist" Dec 06 10:16:21 crc kubenswrapper[4672]: I1206 10:16:21.006304 4672 scope.go:117] "RemoveContainer" containerID="8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050" Dec 06 10:16:21 crc kubenswrapper[4672]: E1206 10:16:21.006671 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050\": container with ID starting with 8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050 not found: ID does not exist" containerID="8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050" Dec 06 10:16:21 crc kubenswrapper[4672]: I1206 10:16:21.006716 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050"} err="failed to get container status \"8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050\": rpc error: code = NotFound desc = could not find container \"8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050\": container with ID starting with 8a44541f024308e3410d602feda02fba0015673eeb5000860a3a63267b0ff050 not found: ID does not exist" Dec 06 10:16:22 crc kubenswrapper[4672]: I1206 10:16:22.569539 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" path="/var/lib/kubelet/pods/6a17a689-a226-4e9b-83e0-4f812e0b12b3/volumes" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.742059 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 10:16:32 crc kubenswrapper[4672]: E1206 10:16:32.743064 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="extract-utilities" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.743085 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="extract-utilities" Dec 06 10:16:32 crc kubenswrapper[4672]: E1206 10:16:32.743127 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="extract-content" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.743135 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="extract-content" Dec 06 10:16:32 crc kubenswrapper[4672]: E1206 10:16:32.743150 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="registry-server" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.743158 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="registry-server" Dec 06 10:16:32 crc kubenswrapper[4672]: E1206 10:16:32.743171 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" containerName="tempest-tests-tempest-tests-runner" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.743180 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" containerName="tempest-tests-tempest-tests-runner" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.743454 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a17a689-a226-4e9b-83e0-4f812e0b12b3" containerName="registry-server" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.743491 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d" containerName="tempest-tests-tempest-tests-runner" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.744307 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.753428 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.754504 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7gqdm" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.827148 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.827224 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msclq\" (UniqueName: \"kubernetes.io/projected/6db39db9-f682-4b08-adce-32d7478a345b-kube-api-access-msclq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.929225 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.929288 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msclq\" (UniqueName: \"kubernetes.io/projected/6db39db9-f682-4b08-adce-32d7478a345b-kube-api-access-msclq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.930006 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.947415 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msclq\" (UniqueName: \"kubernetes.io/projected/6db39db9-f682-4b08-adce-32d7478a345b-kube-api-access-msclq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:32 crc kubenswrapper[4672]: I1206 10:16:32.954511 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6db39db9-f682-4b08-adce-32d7478a345b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:33 crc 
kubenswrapper[4672]: I1206 10:16:33.072307 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:16:33 crc kubenswrapper[4672]: I1206 10:16:33.540074 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 10:16:33 crc kubenswrapper[4672]: W1206 10:16:33.974427 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db39db9_f682_4b08_adce_32d7478a345b.slice/crio-34a3ecdbbdea9476dc395d0538f76f828e5d63aad70e9409fb98b00accf979ee WatchSource:0}: Error finding container 34a3ecdbbdea9476dc395d0538f76f828e5d63aad70e9409fb98b00accf979ee: Status 404 returned error can't find the container with id 34a3ecdbbdea9476dc395d0538f76f828e5d63aad70e9409fb98b00accf979ee Dec 06 10:16:34 crc kubenswrapper[4672]: I1206 10:16:34.028959 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6db39db9-f682-4b08-adce-32d7478a345b","Type":"ContainerStarted","Data":"34a3ecdbbdea9476dc395d0538f76f828e5d63aad70e9409fb98b00accf979ee"} Dec 06 10:16:36 crc kubenswrapper[4672]: I1206 10:16:36.053721 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6db39db9-f682-4b08-adce-32d7478a345b","Type":"ContainerStarted","Data":"f07a1bac8e75156cb4e63e47ffb946ec25aa6b02514eb7c5c6c24cb04339fbd8"} Dec 06 10:16:36 crc kubenswrapper[4672]: I1206 10:16:36.074102 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.698602239 podStartE2EDuration="4.074078031s" podCreationTimestamp="2025-12-06 10:16:32 +0000 UTC" firstStartedPulling="2025-12-06 10:16:33.979817863 +0000 UTC m=+4211.724078150" lastFinishedPulling="2025-12-06 10:16:35.355293655 +0000 UTC m=+4213.099553942" observedRunningTime="2025-12-06 10:16:36.068805478 +0000 UTC m=+4213.813065765" watchObservedRunningTime="2025-12-06 10:16:36.074078031 +0000 UTC m=+4213.818338338" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.107421 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n57zb/must-gather-q2d4j"] Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.109322 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.113562 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n57zb"/"openshift-service-ca.crt" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.114214 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt2mb\" (UniqueName: \"kubernetes.io/projected/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-kube-api-access-bt2mb\") pod \"must-gather-q2d4j\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") " pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.114351 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-must-gather-output\") pod \"must-gather-q2d4j\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") " pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.116941 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n57zb"/"default-dockercfg-fbtdk" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.123763 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n57zb"/"kube-root-ca.crt" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.125742 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n57zb/must-gather-q2d4j"] Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.216079 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt2mb\" (UniqueName: \"kubernetes.io/projected/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-kube-api-access-bt2mb\") pod \"must-gather-q2d4j\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") " pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.216150 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-must-gather-output\") pod \"must-gather-q2d4j\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") " pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.216630 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-must-gather-output\") pod \"must-gather-q2d4j\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") " pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.232475 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt2mb\" (UniqueName: \"kubernetes.io/projected/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-kube-api-access-bt2mb\") pod \"must-gather-q2d4j\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") " pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.426015 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/must-gather-q2d4j" Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.910092 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n57zb/must-gather-q2d4j"] Dec 06 10:17:00 crc kubenswrapper[4672]: I1206 10:17:00.932985 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:17:01 crc kubenswrapper[4672]: I1206 10:17:01.302227 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/must-gather-q2d4j" event={"ID":"45976e37-3f07-4fe3-ad05-d94ab18e2ce1","Type":"ContainerStarted","Data":"b27f9ade44dd13ffcae4910a83f1110bb3a3dbc1be62155f4b1c6c04f2eabebd"} Dec 06 10:17:06 crc kubenswrapper[4672]: I1206 10:17:06.372876 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/must-gather-q2d4j" event={"ID":"45976e37-3f07-4fe3-ad05-d94ab18e2ce1","Type":"ContainerStarted","Data":"68ccde6b01afcb0bae45d7d09f1163cf3ad68815f92cdfd37360d6939271be52"} Dec 06 10:17:06 crc kubenswrapper[4672]: I1206 10:17:06.374127 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/must-gather-q2d4j" event={"ID":"45976e37-3f07-4fe3-ad05-d94ab18e2ce1","Type":"ContainerStarted","Data":"989e6317ca2735021bb8793db4f29d7c9773b335d05055aeb5752f96430ea0b7"} Dec 06 10:17:06 crc kubenswrapper[4672]: I1206 10:17:06.403954 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n57zb/must-gather-q2d4j" podStartSLOduration=2.234670003 podStartE2EDuration="6.403937178s" podCreationTimestamp="2025-12-06 10:17:00 +0000 UTC" firstStartedPulling="2025-12-06 10:17:00.93277522 +0000 UTC m=+4238.677035507" lastFinishedPulling="2025-12-06 10:17:05.102042395 +0000 UTC m=+4242.846302682" observedRunningTime="2025-12-06 10:17:06.400009581 +0000 UTC m=+4244.144269868" watchObservedRunningTime="2025-12-06 10:17:06.403937178 +0000 UTC m=+4244.148197465" Dec 06 10:17:10 crc kubenswrapper[4672]: E1206 10:17:10.844382 4672 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:47864->38.102.83.30:37519: write tcp 38.102.83.30:47864->38.102.83.30:37519: write: broken pipe Dec 06 10:17:11 crc kubenswrapper[4672]: I1206 10:17:11.915265 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n57zb/crc-debug-7m4hw"] Dec 06 10:17:11 crc kubenswrapper[4672]: I1206 10:17:11.917088 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:11 crc kubenswrapper[4672]: I1206 10:17:11.980171 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6slr\" (UniqueName: \"kubernetes.io/projected/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-kube-api-access-r6slr\") pod \"crc-debug-7m4hw\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:11 crc kubenswrapper[4672]: I1206 10:17:11.980972 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-host\") pod \"crc-debug-7m4hw\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:12 crc kubenswrapper[4672]: I1206 10:17:12.083998 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-host\") pod \"crc-debug-7m4hw\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:12 crc kubenswrapper[4672]: I1206 10:17:12.084287 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6slr\" (UniqueName: \"kubernetes.io/projected/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-kube-api-access-r6slr\") pod \"crc-debug-7m4hw\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:12 crc kubenswrapper[4672]: I1206 10:17:12.084421 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-host\") pod \"crc-debug-7m4hw\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:12 crc kubenswrapper[4672]: I1206 10:17:12.105040 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6slr\" (UniqueName: \"kubernetes.io/projected/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-kube-api-access-r6slr\") pod \"crc-debug-7m4hw\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:12 crc kubenswrapper[4672]: I1206 10:17:12.238094 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:17:12 crc kubenswrapper[4672]: I1206 10:17:12.420337 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" event={"ID":"7fe92a0d-d0c7-4afb-807c-63389ced5e7c","Type":"ContainerStarted","Data":"479424aa0a80e9d1c2c88b2f8e2cd2e92e3e7060f85da266cf7de4955aa52686"} Dec 06 10:17:22 crc kubenswrapper[4672]: I1206 10:17:22.506002 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" event={"ID":"7fe92a0d-d0c7-4afb-807c-63389ced5e7c","Type":"ContainerStarted","Data":"d66a4e55cf79990bbcb7878d5caf4251799ea74afb528278106d1dd0d7aec265"} Dec 06 10:17:22 crc kubenswrapper[4672]: I1206 10:17:22.522168 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" podStartSLOduration=2.160328867 podStartE2EDuration="11.522152647s" podCreationTimestamp="2025-12-06 10:17:11 +0000 UTC" firstStartedPulling="2025-12-06 10:17:12.276296834 +0000 UTC m=+4250.020557121" lastFinishedPulling="2025-12-06 10:17:21.638120614 +0000 UTC m=+4259.382380901" observedRunningTime="2025-12-06 10:17:22.519346461 +0000 UTC m=+4260.263606748" watchObservedRunningTime="2025-12-06 10:17:22.522152647 +0000 UTC m=+4260.266412934" Dec 06 10:18:03 crc kubenswrapper[4672]: I1206 10:18:03.988890 4672 generic.go:334] "Generic (PLEG): container finished" podID="7fe92a0d-d0c7-4afb-807c-63389ced5e7c" containerID="d66a4e55cf79990bbcb7878d5caf4251799ea74afb528278106d1dd0d7aec265" exitCode=0 Dec 06 10:18:03 crc kubenswrapper[4672]: I1206 10:18:03.988974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" event={"ID":"7fe92a0d-d0c7-4afb-807c-63389ced5e7c","Type":"ContainerDied","Data":"d66a4e55cf79990bbcb7878d5caf4251799ea74afb528278106d1dd0d7aec265"} Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.189562 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.223331 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n57zb/crc-debug-7m4hw"] Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.231775 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n57zb/crc-debug-7m4hw"] Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.290419 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6slr\" (UniqueName: \"kubernetes.io/projected/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-kube-api-access-r6slr\") pod \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.290504 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-host\") pod \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\" (UID: \"7fe92a0d-d0c7-4afb-807c-63389ced5e7c\") " Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.290945 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-host" (OuterVolumeSpecName: "host") pod "7fe92a0d-d0c7-4afb-807c-63389ced5e7c" (UID: "7fe92a0d-d0c7-4afb-807c-63389ced5e7c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.297418 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-kube-api-access-r6slr" (OuterVolumeSpecName: "kube-api-access-r6slr") pod "7fe92a0d-d0c7-4afb-807c-63389ced5e7c" (UID: "7fe92a0d-d0c7-4afb-807c-63389ced5e7c"). InnerVolumeSpecName "kube-api-access-r6slr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.392540 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6slr\" (UniqueName: \"kubernetes.io/projected/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-kube-api-access-r6slr\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:05 crc kubenswrapper[4672]: I1206 10:18:05.392810 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fe92a0d-d0c7-4afb-807c-63389ced5e7c-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.010742 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="479424aa0a80e9d1c2c88b2f8e2cd2e92e3e7060f85da266cf7de4955aa52686" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.010823 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-7m4hw" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.422238 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n57zb/crc-debug-n2wcv"] Dec 06 10:18:06 crc kubenswrapper[4672]: E1206 10:18:06.422756 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe92a0d-d0c7-4afb-807c-63389ced5e7c" containerName="container-00" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.422771 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe92a0d-d0c7-4afb-807c-63389ced5e7c" containerName="container-00" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.423012 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe92a0d-d0c7-4afb-807c-63389ced5e7c" containerName="container-00" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.424087 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.514981 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-host\") pod \"crc-debug-n2wcv\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.515057 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwndk\" (UniqueName: \"kubernetes.io/projected/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-kube-api-access-xwndk\") pod \"crc-debug-n2wcv\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.565275 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe92a0d-d0c7-4afb-807c-63389ced5e7c" path="/var/lib/kubelet/pods/7fe92a0d-d0c7-4afb-807c-63389ced5e7c/volumes" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.616432 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-host\") pod \"crc-debug-n2wcv\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.616484 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwndk\" (UniqueName: \"kubernetes.io/projected/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-kube-api-access-xwndk\") pod \"crc-debug-n2wcv\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.616694 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-host\") pod \"crc-debug-n2wcv\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.639767 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwndk\" (UniqueName: \"kubernetes.io/projected/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-kube-api-access-xwndk\") pod \"crc-debug-n2wcv\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:06 crc kubenswrapper[4672]: I1206 10:18:06.750819 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:07 crc kubenswrapper[4672]: I1206 10:18:07.021382 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-n2wcv" event={"ID":"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16","Type":"ContainerStarted","Data":"529568dce6f6faeeb37b840d269e6144bcd5c8d13723b923b5977a0ee1d4c291"} Dec 06 10:18:08 crc kubenswrapper[4672]: I1206 10:18:08.031157 4672 generic.go:334] "Generic (PLEG): container finished" podID="06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" containerID="06cb580499ed16e48a6f8ac95f2d07745ad29b116eaf69512693f4f348bc4742" exitCode=0 Dec 06 10:18:08 crc kubenswrapper[4672]: I1206 10:18:08.031199 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-n2wcv" event={"ID":"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16","Type":"ContainerDied","Data":"06cb580499ed16e48a6f8ac95f2d07745ad29b116eaf69512693f4f348bc4742"} Dec 06 10:18:08 crc kubenswrapper[4672]: I1206 10:18:08.433865 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n57zb/crc-debug-n2wcv"] Dec 06 10:18:08 crc kubenswrapper[4672]: I1206 10:18:08.442803 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n57zb/crc-debug-n2wcv"] Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.135676 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.267945 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwndk\" (UniqueName: \"kubernetes.io/projected/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-kube-api-access-xwndk\") pod \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.268061 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-host\") pod \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\" (UID: \"06b80d1d-a5c3-47ff-b2b0-ae46a9507f16\") " Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.268550 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-host" (OuterVolumeSpecName: "host") pod "06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" (UID: "06b80d1d-a5c3-47ff-b2b0-ae46a9507f16"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.272837 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-kube-api-access-xwndk" (OuterVolumeSpecName: "kube-api-access-xwndk") pod "06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" (UID: "06b80d1d-a5c3-47ff-b2b0-ae46a9507f16"). InnerVolumeSpecName "kube-api-access-xwndk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.369853 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwndk\" (UniqueName: \"kubernetes.io/projected/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-kube-api-access-xwndk\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.369900 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.611678 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n57zb/crc-debug-whgsf"] Dec 06 10:18:09 crc kubenswrapper[4672]: E1206 10:18:09.612048 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" containerName="container-00" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.612067 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" containerName="container-00" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.612245 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" containerName="container-00" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.612947 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.678223 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqz8\" (UniqueName: \"kubernetes.io/projected/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-kube-api-access-pgqz8\") pod \"crc-debug-whgsf\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.678760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-host\") pod \"crc-debug-whgsf\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.780499 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqz8\" (UniqueName: \"kubernetes.io/projected/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-kube-api-access-pgqz8\") pod \"crc-debug-whgsf\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.780734 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-host\") pod \"crc-debug-whgsf\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.780870 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-host\") pod \"crc-debug-whgsf\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.802955 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pgqz8\" (UniqueName: \"kubernetes.io/projected/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-kube-api-access-pgqz8\") pod \"crc-debug-whgsf\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:09 crc kubenswrapper[4672]: I1206 10:18:09.936952 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:10 crc kubenswrapper[4672]: I1206 10:18:10.055105 4672 scope.go:117] "RemoveContainer" containerID="06cb580499ed16e48a6f8ac95f2d07745ad29b116eaf69512693f4f348bc4742" Dec 06 10:18:10 crc kubenswrapper[4672]: I1206 10:18:10.055269 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-n2wcv" Dec 06 10:18:10 crc kubenswrapper[4672]: I1206 10:18:10.080748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-whgsf" event={"ID":"2fdf43a6-fd1b-471a-b9c1-7839dd02a969","Type":"ContainerStarted","Data":"45859d6e1f98af199df7206fddeaac78be9943f96df2c2a7be6259215b1651ae"} Dec 06 10:18:10 crc kubenswrapper[4672]: I1206 10:18:10.569016 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b80d1d-a5c3-47ff-b2b0-ae46a9507f16" path="/var/lib/kubelet/pods/06b80d1d-a5c3-47ff-b2b0-ae46a9507f16/volumes" Dec 06 10:18:11 crc kubenswrapper[4672]: I1206 10:18:11.110371 4672 generic.go:334] "Generic (PLEG): container finished" podID="2fdf43a6-fd1b-471a-b9c1-7839dd02a969" containerID="54702c393d48abd86ada1f74e32b21b88f88933b29604af9b72adb67d47ec34a" exitCode=0 Dec 06 10:18:11 crc kubenswrapper[4672]: I1206 10:18:11.110489 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/crc-debug-whgsf" event={"ID":"2fdf43a6-fd1b-471a-b9c1-7839dd02a969","Type":"ContainerDied","Data":"54702c393d48abd86ada1f74e32b21b88f88933b29604af9b72adb67d47ec34a"} Dec 06 10:18:11 crc kubenswrapper[4672]: I1206 10:18:11.151362 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n57zb/crc-debug-whgsf"] Dec 06 10:18:11 crc kubenswrapper[4672]: I1206 10:18:11.159597 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n57zb/crc-debug-whgsf"] Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.233516 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.319267 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.319335 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.366155 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-host\") pod \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.366315 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgqz8\" (UniqueName: \"kubernetes.io/projected/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-kube-api-access-pgqz8\") pod \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\" (UID: \"2fdf43a6-fd1b-471a-b9c1-7839dd02a969\") " Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.366778 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-host" (OuterVolumeSpecName: "host") pod "2fdf43a6-fd1b-471a-b9c1-7839dd02a969" (UID: "2fdf43a6-fd1b-471a-b9c1-7839dd02a969"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.367124 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.385853 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-kube-api-access-pgqz8" (OuterVolumeSpecName: "kube-api-access-pgqz8") pod "2fdf43a6-fd1b-471a-b9c1-7839dd02a969" (UID: "2fdf43a6-fd1b-471a-b9c1-7839dd02a969"). InnerVolumeSpecName "kube-api-access-pgqz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.469443 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgqz8\" (UniqueName: \"kubernetes.io/projected/2fdf43a6-fd1b-471a-b9c1-7839dd02a969-kube-api-access-pgqz8\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:12 crc kubenswrapper[4672]: I1206 10:18:12.576590 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fdf43a6-fd1b-471a-b9c1-7839dd02a969" path="/var/lib/kubelet/pods/2fdf43a6-fd1b-471a-b9c1-7839dd02a969/volumes" Dec 06 10:18:13 crc kubenswrapper[4672]: I1206 10:18:13.128398 4672 scope.go:117] "RemoveContainer" containerID="54702c393d48abd86ada1f74e32b21b88f88933b29604af9b72adb67d47ec34a" Dec 06 10:18:13 crc kubenswrapper[4672]: I1206 10:18:13.128454 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n57zb/crc-debug-whgsf" Dec 06 10:18:42 crc kubenswrapper[4672]: I1206 10:18:42.320211 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:18:42 crc kubenswrapper[4672]: I1206 10:18:42.320815 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.320348 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.321210 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.321331 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.322669 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.322803 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" gracePeriod=600 Dec 06 10:19:12 crc kubenswrapper[4672]: E1206 10:19:12.488966 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.681849 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" exitCode=0 Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.681901 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" 
event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"} Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.681943 4672 scope.go:117] "RemoveContainer" containerID="a9fc2abd84860456c000814e8f44da296b7a156fa72f23c9ec5a13b3d45f8bff" Dec 06 10:19:12 crc kubenswrapper[4672]: I1206 10:19:12.682849 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:19:12 crc kubenswrapper[4672]: E1206 10:19:12.683218 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:19:16 crc kubenswrapper[4672]: I1206 10:19:16.432314 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7fcb648c94-6bbbh_28410d08-bb47-4a67-a4d8-c06929b8c644/barbican-api/0.log" Dec 06 10:19:16 crc kubenswrapper[4672]: I1206 10:19:16.598054 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7fcb648c94-6bbbh_28410d08-bb47-4a67-a4d8-c06929b8c644/barbican-api-log/0.log" Dec 06 10:19:16 crc kubenswrapper[4672]: I1206 10:19:16.631459 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868fbbdb46-nq8wk_79992d1b-dc0e-43ad-b6cd-942fadb148e6/barbican-keystone-listener/0.log" Dec 06 10:19:16 crc kubenswrapper[4672]: I1206 10:19:16.736155 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868fbbdb46-nq8wk_79992d1b-dc0e-43ad-b6cd-942fadb148e6/barbican-keystone-listener-log/0.log" Dec 06 10:19:16 crc kubenswrapper[4672]: I1206 10:19:16.881324 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d54549b45-whq64_963924f1-d56b-4422-af4a-cc5c3a17944f/barbican-worker/0.log" Dec 06 10:19:16 crc kubenswrapper[4672]: I1206 10:19:16.916030 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d54549b45-whq64_963924f1-d56b-4422-af4a-cc5c3a17944f/barbican-worker-log/0.log" Dec 06 10:19:17 crc kubenswrapper[4672]: I1206 10:19:17.070951 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9_87cce220-e210-44d8-ac72-946b6e9bb4c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:19:17 crc kubenswrapper[4672]: I1206 10:19:17.145259 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/ceilometer-central-agent/0.log" Dec 06 10:19:17 crc kubenswrapper[4672]: I1206 10:19:17.220663 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/ceilometer-notification-agent/0.log" Dec 06 10:19:17 crc kubenswrapper[4672]: I1206 10:19:17.338738 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/proxy-httpd/0.log" Dec 06 10:19:17 crc kubenswrapper[4672]: I1206 10:19:17.423477 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/sg-core/0.log" 
Dec 06 10:19:17 crc kubenswrapper[4672]: I1206 10:19:17.426049 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v_a0bb0cdb-025d-4251-b0f0-06185ea6fe8f/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.082090 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d_fc76ad12-899a-427b-abd7-57b3375a29ea/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.151677 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1d2747dd-122d-4920-a266-6be569a3ab33/cinder-api/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.300648 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1d2747dd-122d-4920-a266-6be569a3ab33/cinder-api-log/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.371084 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_db88dbb4-2112-4bec-a4e4-f0bf562bb173/probe/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.468823 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_db88dbb4-2112-4bec-a4e4-f0bf562bb173/cinder-backup/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.625956 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dda2373-3b28-4086-b29c-3415f50f1d92/cinder-scheduler/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.640140 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dda2373-3b28-4086-b29c-3415f50f1d92/probe/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.829250 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_a35af03e-7b48-40ee-a857-20824a664f4e/cinder-volume/0.log"
Dec 06 10:19:18 crc kubenswrapper[4672]: I1206 10:19:18.848857 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_a35af03e-7b48-40ee-a857-20824a664f4e/probe/0.log"
Dec 06 10:19:19 crc kubenswrapper[4672]: I1206 10:19:19.003430 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2_d3188b54-be64-4ee4-a4ff-4af6f300e58a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:19 crc kubenswrapper[4672]: I1206 10:19:19.525680 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc_02d2290f-9fc0-4247-8db8-660f26601528/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:19 crc kubenswrapper[4672]: I1206 10:19:19.724133 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b586f587-9rz4l_961904ba-a936-4912-b5a1-a20e4a4028e6/init/0.log"
Dec 06 10:19:19 crc kubenswrapper[4672]: I1206 10:19:19.791002 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b586f587-9rz4l_961904ba-a936-4912-b5a1-a20e4a4028e6/init/0.log"
Dec 06 10:19:19 crc kubenswrapper[4672]: I1206 10:19:19.922712 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b586f587-9rz4l_961904ba-a936-4912-b5a1-a20e4a4028e6/dnsmasq-dns/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.006877 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9a6d2d22-6464-4bf5-9bf6-e3515efedbf4/glance-httpd/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.061045 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9a6d2d22-6464-4bf5-9bf6-e3515efedbf4/glance-log/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.214647 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_97cea7c5-c51e-4001-b398-28bdbccd9a97/glance-httpd/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.309554 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c74dbc66-8ghhf_70d8ba3e-3f2d-4627-afab-5bb8908f89eb/horizon/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.312258 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_97cea7c5-c51e-4001-b398-28bdbccd9a97/glance-log/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.541795 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c74dbc66-8ghhf_70d8ba3e-3f2d-4627-afab-5bb8908f89eb/horizon-log/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.581895 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gzmnw_cad15908-dabc-4b48-9aa7-977801ce63ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.611988 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj_131ab019-8934-4783-ab57-b3ecccd11b05/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.801902 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416921-tdtbn_f321169c-b38c-4403-8541-48064fd878b2/keystone-cron/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.879746 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76bb4c894-tw7m5_010644c2-5d3a-41e3-a27a-31a6e1d3a0b4/keystone-api/0.log"
Dec 06 10:19:20 crc kubenswrapper[4672]: I1206 10:19:20.965057 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d6a82c78-9f40-4c1a-8f10-03f92549df7b/kube-state-metrics/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.073506 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r_cc85e883-c516-489f-b15d-6e57e4236b75/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.207328 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_42ee5091-4d7a-4807-905e-19dddd238386/manila-api/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.260477 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_42ee5091-4d7a-4807-905e-19dddd238386/manila-api-log/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.410134 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_ab157348-d161-4de2-bf4c-084cb71b0982/probe/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.412780 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_ab157348-d161-4de2-bf4c-084cb71b0982/manila-scheduler/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.526802 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d7cfcb36-13c1-4215-b316-b2082d41bcae/manila-share/0.log"
Dec 06 10:19:21 crc kubenswrapper[4672]: I1206 10:19:21.604457 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d7cfcb36-13c1-4215-b316-b2082d41bcae/probe/0.log"
Dec 06 10:19:22 crc kubenswrapper[4672]: I1206 10:19:22.013762 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d999c477-wf9vn_bc4c1773-bc77-4592-aff9-04323f477805/neutron-api/0.log"
Dec 06 10:19:22 crc kubenswrapper[4672]: I1206 10:19:22.015321 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d999c477-wf9vn_bc4c1773-bc77-4592-aff9-04323f477805/neutron-httpd/0.log"
Dec 06 10:19:22 crc kubenswrapper[4672]: I1206 10:19:22.176508 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f_0a4871b2-574b-433c-8491-9147da825602/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:22 crc kubenswrapper[4672]: I1206 10:19:22.877217 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1/nova-api-log/0.log"
Dec 06 10:19:22 crc kubenswrapper[4672]: I1206 10:19:22.907197 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ee39fc48-02ae-46b6-90b8-5b82cafad74d/nova-cell0-conductor-conductor/0.log"
Dec 06 10:19:23 crc kubenswrapper[4672]: I1206 10:19:23.123141 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2c1902d9-bb65-4974-a922-056811447603/nova-cell1-conductor-conductor/0.log"
Dec 06 10:19:23 crc kubenswrapper[4672]: I1206 10:19:23.297052 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21ff730f-c3e2-4cf0-8e52-8345907156f1/nova-cell1-novncproxy-novncproxy/0.log"
Dec 06 10:19:23 crc kubenswrapper[4672]: I1206 10:19:23.331267 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1/nova-api-api/0.log"
Dec 06 10:19:23 crc kubenswrapper[4672]: I1206 10:19:23.536825 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb_b27237d2-1240-4f55-a12b-9248c3a899e4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 06 10:19:23 crc kubenswrapper[4672]: I1206 10:19:23.956117 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2610b3a3-94e4-4583-b42a-739e7dd1bfc7/nova-metadata-log/0.log"
Dec 06 10:19:24 crc kubenswrapper[4672]: I1206 10:19:24.293396 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8c53efb2-1642-4efd-b920-7ad41e6c136a/mysql-bootstrap/0.log"
Dec 06 10:19:24 crc kubenswrapper[4672]: I1206 10:19:24.302740 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e0222d9d-628a-423d-b12a-377e94b3ac5c/nova-scheduler-scheduler/0.log"
Dec 06 10:19:24 crc kubenswrapper[4672]: I1206 10:19:24.512318 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8c53efb2-1642-4efd-b920-7ad41e6c136a/galera/0.log"
Dec 06
Dec 06 10:19:24 crc kubenswrapper[4672]: I1206 10:19:24.556345 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:19:24 crc kubenswrapper[4672]: E1206 10:19:24.556592 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:19:24 crc kubenswrapper[4672]: I1206 10:19:24.589922 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8c53efb2-1642-4efd-b920-7ad41e6c136a/mysql-bootstrap/0.log"
Dec 06 10:19:24 crc kubenswrapper[4672]: I1206 10:19:24.822550 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_37d0f081-e2da-4845-9097-31607c42efc4/mysql-bootstrap/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.040112 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_37d0f081-e2da-4845-9097-31607c42efc4/mysql-bootstrap/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.071556 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_37d0f081-e2da-4845-9097-31607c42efc4/galera/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.269807 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_44fea3e1-80c8-4525-b613-467978a95351/openstackclient/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.376443 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hxgmq_2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6/ovn-controller/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.505593 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2610b3a3-94e4-4583-b42a-739e7dd1bfc7/nova-metadata-metadata/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.614384 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-q5ktw_a8b30f64-653c-49e8-857d-af30b3126e2d/openstack-network-exporter/0.log"
Dec 06 10:19:25 crc kubenswrapper[4672]: I1206 10:19:25.859807 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovsdb-server-init/0.log"
Dec 06 10:19:26 crc kubenswrapper[4672]: I1206 10:19:26.518387 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovs-vswitchd/0.log"
Dec 06 10:19:26 crc kubenswrapper[4672]: I1206 10:19:26.539518 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovsdb-server/0.log"
Dec 06 10:19:26 crc kubenswrapper[4672]: I1206 10:19:26.551468 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovsdb-server-init/0.log"
Dec 06 10:19:26 crc kubenswrapper[4672]: I1206 10:19:26.847850 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-ckzpr_3c8ad536-4cb5-4454-bcc3-5b13cb92215d/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-ckzpr_3c8ad536-4cb5-4454-bcc3-5b13cb92215d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:19:26 crc kubenswrapper[4672]: I1206 10:19:26.864169 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb/openstack-network-exporter/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.007909 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb/ovn-northd/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.175868 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9f085ca1-832b-40dc-b131-2c287df92f6e/openstack-network-exporter/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.188744 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9f085ca1-832b-40dc-b131-2c287df92f6e/ovsdbserver-nb/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.355562 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4cddfb03-e3ff-478e-91c7-e3b58145d1e6/openstack-network-exporter/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.449682 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4cddfb03-e3ff-478e-91c7-e3b58145d1e6/ovsdbserver-sb/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.668048 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-777f8d8c58-75kwt_46caf0fe-e392-43fb-8893-2a7bd67bd1a7/placement-api/0.log" Dec 06 10:19:27 crc kubenswrapper[4672]: I1206 10:19:27.714253 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-777f8d8c58-75kwt_46caf0fe-e392-43fb-8893-2a7bd67bd1a7/placement-log/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.100773 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f71615b-1205-44b2-b4aa-c03548716486/setup-container/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.366995 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f71615b-1205-44b2-b4aa-c03548716486/setup-container/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.418179 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3cf9f22-30ac-48ca-9d05-407868710c73/setup-container/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.419130 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f71615b-1205-44b2-b4aa-c03548716486/rabbitmq/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.736159 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3cf9f22-30ac-48ca-9d05-407868710c73/rabbitmq/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.766069 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3cf9f22-30ac-48ca-9d05-407868710c73/setup-container/0.log" Dec 06 10:19:28 crc kubenswrapper[4672]: I1206 10:19:28.896154 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt_bb01149f-0837-46f0-8636-b72f5fb85e9a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:19:29 crc kubenswrapper[4672]: I1206 10:19:29.010865 4672 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt_38dd2d36-2778-405f-97b8-d2651746de0c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:19:29 crc kubenswrapper[4672]: I1206 10:19:29.202795 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9f6tf_5c6bfe13-aab7-4455-9879-a1e1e7276407/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:19:29 crc kubenswrapper[4672]: I1206 10:19:29.415929 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pmvr2_e6867be1-002c-4eae-b841-885e3e5e5d20/ssh-known-hosts-edpm-deployment/0.log" Dec 06 10:19:29 crc kubenswrapper[4672]: I1206 10:19:29.463768 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d/tempest-tests-tempest-tests-runner/0.log" Dec 06 10:19:29 crc kubenswrapper[4672]: I1206 10:19:29.601957 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6db39db9-f682-4b08-adce-32d7478a345b/test-operator-logs-container/0.log" Dec 06 10:19:29 crc kubenswrapper[4672]: I1206 10:19:29.786468 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn_53ed8161-58e0-4b3b-91bf-190216b16b12/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:19:37 crc kubenswrapper[4672]: I1206 10:19:37.557270 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:19:37 crc kubenswrapper[4672]: E1206 10:19:37.557900 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:19:42 crc kubenswrapper[4672]: I1206 10:19:42.165940 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7595f929-2c12-4a7f-ba33-2701f7a701ee/memcached/0.log" Dec 06 10:19:52 crc kubenswrapper[4672]: I1206 10:19:52.564412 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:19:52 crc kubenswrapper[4672]: E1206 10:19:52.565558 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.212413 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/util/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.417405 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/pull/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.434156 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/util/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.436766 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/pull/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.670029 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/extract/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.730109 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/pull/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.788325 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/util/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.928438 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-lh7x2_ce4e8b8a-4f3a-4303-9455-8eb984c06f57/kube-rbac-proxy/0.log" Dec 06 10:20:01 crc kubenswrapper[4672]: I1206 10:20:01.977750 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-lh7x2_ce4e8b8a-4f3a-4303-9455-8eb984c06f57/manager/0.log" Dec 06 10:20:02 crc kubenswrapper[4672]: I1206 10:20:02.063202 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-cpc5n_7dc29189-4c37-4886-af89-7c6cb57f237e/kube-rbac-proxy/0.log" Dec 06 10:20:02 crc kubenswrapper[4672]: I1206 10:20:02.262107 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-cpc5n_7dc29189-4c37-4886-af89-7c6cb57f237e/manager/0.log" Dec 06 10:20:02 crc kubenswrapper[4672]: I1206 10:20:02.888219 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6jcpj_7e99a7a0-5a1d-4143-a8b7-9fb170d119a2/kube-rbac-proxy/0.log" Dec 06 10:20:02 crc kubenswrapper[4672]: I1206 10:20:02.921304 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6jcpj_7e99a7a0-5a1d-4143-a8b7-9fb170d119a2/manager/0.log" Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 10:20:03.127450 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-p7c94_018edeb2-cc58-49fe-a7ea-15a8b6646ddd/kube-rbac-proxy/0.log" Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 10:20:03.209140 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2zwxr_96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43/kube-rbac-proxy/0.log" Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 
Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 10:20:03.398162 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2zwxr_96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43/manager/0.log"
Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 10:20:03.431840 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dvzm4_7753548d-df52-4a65-b447-d20dcd379cde/kube-rbac-proxy/0.log"
Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 10:20:03.525976 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dvzm4_7753548d-df52-4a65-b447-d20dcd379cde/manager/0.log"
Dec 06 10:20:03 crc kubenswrapper[4672]: I1206 10:20:03.557179 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:20:03 crc kubenswrapper[4672]: E1206 10:20:03.557400 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:20:04 crc kubenswrapper[4672]: I1206 10:20:04.402247 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-8ql2p_9977f421-c235-40ef-8d9f-2e0125bf3593/kube-rbac-proxy/0.log"
Dec 06 10:20:04 crc kubenswrapper[4672]: I1206 10:20:04.408450 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-rwjvr_6bbb7d8a-ba3a-476a-b09d-0fd084fc325e/kube-rbac-proxy/0.log"
Dec 06 10:20:04 crc kubenswrapper[4672]: I1206 10:20:04.575430 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-rwjvr_6bbb7d8a-ba3a-476a-b09d-0fd084fc325e/manager/0.log"
Dec 06 10:20:04 crc kubenswrapper[4672]: I1206 10:20:04.702396 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-8ql2p_9977f421-c235-40ef-8d9f-2e0125bf3593/manager/0.log"
Dec 06 10:20:04 crc kubenswrapper[4672]: I1206 10:20:04.866224 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-j7cvj_308c58b1-3c6a-4c79-88fc-b4d515efd96d/kube-rbac-proxy/0.log"
Dec 06 10:20:04 crc kubenswrapper[4672]: I1206 10:20:04.911389 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-j7cvj_308c58b1-3c6a-4c79-88fc-b4d515efd96d/manager/0.log"
Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.064622 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zxcvx_3fda2255-f593-42c6-b17e-2996a6ce7c5e/kube-rbac-proxy/0.log"
Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.215242 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zxcvx_3fda2255-f593-42c6-b17e-2996a6ce7c5e/manager/0.log"
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zxcvx_3fda2255-f593-42c6-b17e-2996a6ce7c5e/manager/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.250233 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-crbgz_27d7a7f5-ab93-40b6-8718-0a8b930d2c0f/kube-rbac-proxy/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.355389 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-crbgz_27d7a7f5-ab93-40b6-8718-0a8b930d2c0f/manager/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.413681 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pqnb9_a59bea52-a8d1-4ac9-8ce0-0a623efcb009/kube-rbac-proxy/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.603612 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pqnb9_a59bea52-a8d1-4ac9-8ce0-0a623efcb009/manager/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.683394 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kpmch_8244458a-10b4-4c4f-8f9e-dc93e90329af/kube-rbac-proxy/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.788948 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-nkk8g_73aa720c-9e22-4ef9-a5b4-512c0194f0a4/kube-rbac-proxy/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.863168 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kpmch_8244458a-10b4-4c4f-8f9e-dc93e90329af/manager/0.log" Dec 06 10:20:05 crc kubenswrapper[4672]: I1206 10:20:05.975294 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-nkk8g_73aa720c-9e22-4ef9-a5b4-512c0194f0a4/manager/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.053917 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f586tjc_4794dd53-214a-4537-90c9-0527db628c8b/kube-rbac-proxy/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.203138 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f586tjc_4794dd53-214a-4537-90c9-0527db628c8b/manager/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.623777 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rx48l_250af723-f950-4125-8748-d7eac336f4c1/registry-server/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.649021 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-vc2xx_63582a9a-093b-44e1-8932-4b910f301e52/operator/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.768959 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nqh5d_e25e6854-1001-4962-bd9b-f4cb37ebefe1/kube-rbac-proxy/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.888723 4672 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nqh5d_e25e6854-1001-4962-bd9b-f4cb37ebefe1/manager/0.log" Dec 06 10:20:06 crc kubenswrapper[4672]: I1206 10:20:06.990669 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-vxgjl_d1ba66a9-3383-413f-b2d3-fb13a4e4592b/kube-rbac-proxy/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.053328 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-vxgjl_d1ba66a9-3383-413f-b2d3-fb13a4e4592b/manager/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.255423 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ntvgh_dd2774f1-51aa-4387-aaf1-02cd8329ae1d/operator/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.305683 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5z9dq_d6abdea8-a426-4553-b4e7-8998d96eaed3/kube-rbac-proxy/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.346344 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-zpt5t_72a85d5f-d856-47b2-b0d6-f1fe23722f39/manager/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.505545 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5z9dq_d6abdea8-a426-4553-b4e7-8998d96eaed3/manager/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.601462 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-49652_30a955f4-c456-4d9e-9621-dce7e9f7b8b8/kube-rbac-proxy/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.694809 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-49652_30a955f4-c456-4d9e-9621-dce7e9f7b8b8/manager/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.815308 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-9p8xf_b88a6b36-14ee-4898-beb2-dae9d2be7600/kube-rbac-proxy/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.880678 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-9p8xf_b88a6b36-14ee-4898-beb2-dae9d2be7600/manager/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.895991 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xbspr_274d0d53-a194-47e5-b20d-e56155f01e72/kube-rbac-proxy/0.log" Dec 06 10:20:07 crc kubenswrapper[4672]: I1206 10:20:07.939142 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xbspr_274d0d53-a194-47e5-b20d-e56155f01e72/manager/0.log" Dec 06 10:20:14 crc kubenswrapper[4672]: I1206 10:20:14.557119 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:20:14 crc kubenswrapper[4672]: E1206 10:20:14.557964 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:20:27 crc kubenswrapper[4672]: I1206 10:20:27.557481 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:20:27 crc kubenswrapper[4672]: E1206 10:20:27.558436 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:20:31 crc kubenswrapper[4672]: I1206 10:20:31.721157 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vrlvb_9a2d76b4-eb44-49ba-ad51-fbe3022af615/control-plane-machine-set-operator/0.log" Dec 06 10:20:31 crc kubenswrapper[4672]: I1206 10:20:31.974396 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b8m6z_87e773f5-6efb-4613-9af8-f05c7af849e1/kube-rbac-proxy/0.log" Dec 06 10:20:32 crc kubenswrapper[4672]: I1206 10:20:32.049170 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b8m6z_87e773f5-6efb-4613-9af8-f05c7af849e1/machine-api-operator/0.log" Dec 06 10:20:42 crc kubenswrapper[4672]: I1206 10:20:42.562752 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:20:42 crc kubenswrapper[4672]: E1206 10:20:42.563529 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:20:47 crc kubenswrapper[4672]: I1206 10:20:47.017671 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-kkdp5_ca049150-2cd7-48c8-a77a-90379dbd818b/cert-manager-controller/0.log" Dec 06 10:20:47 crc kubenswrapper[4672]: I1206 10:20:47.155087 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qscd7_9a0083d7-9175-4399-aaf0-0767c9d88faf/cert-manager-cainjector/0.log" Dec 06 10:20:47 crc kubenswrapper[4672]: I1206 10:20:47.510962 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-kdh29_23285e10-efd9-47e7-929b-e3fa93131669/cert-manager-webhook/0.log" Dec 06 10:20:54 crc kubenswrapper[4672]: I1206 10:20:54.556694 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:20:54 crc kubenswrapper[4672]: E1206 10:20:54.557510 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:21:02 crc kubenswrapper[4672]: I1206 10:21:02.437064 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-zzxp6_8681df0a-44cf-471f-9257-bda9bae18f87/nmstate-console-plugin/0.log" Dec 06 10:21:02 crc kubenswrapper[4672]: I1206 10:21:02.542317 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7m49h_a127c4da-7435-45e4-b772-f8e53381bea2/nmstate-handler/0.log" Dec 06 10:21:02 crc kubenswrapper[4672]: I1206 10:21:02.706188 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-kv76p_23695df9-9be3-41a1-af24-8e35e5a875d2/nmstate-metrics/0.log" Dec 06 10:21:02 crc kubenswrapper[4672]: I1206 10:21:02.767779 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-kv76p_23695df9-9be3-41a1-af24-8e35e5a875d2/kube-rbac-proxy/0.log" Dec 06 10:21:02 crc kubenswrapper[4672]: I1206 10:21:02.840052 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-9bwkk_63eadd21-65ec-4fbd-8c8c-265a1ade0b4c/nmstate-operator/0.log" Dec 06 10:21:03 crc kubenswrapper[4672]: I1206 10:21:03.006884 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-mp77w_08d88a07-50e0-4273-bbb4-9d6ed17820a8/nmstate-webhook/0.log" Dec 06 10:21:09 crc kubenswrapper[4672]: I1206 10:21:09.557083 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:21:09 crc kubenswrapper[4672]: E1206 10:21:09.558784 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:21:18 crc kubenswrapper[4672]: I1206 10:21:18.693673 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ljcvb_35023ac9-ea1e-4576-b700-4afe57f59230/kube-rbac-proxy/0.log" Dec 06 10:21:18 crc kubenswrapper[4672]: I1206 10:21:18.707421 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ljcvb_35023ac9-ea1e-4576-b700-4afe57f59230/controller/0.log" Dec 06 10:21:18 crc kubenswrapper[4672]: I1206 10:21:18.842314 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-mqk7f_2a795467-0c6f-4dae-bd0e-0595c9eb88b4/frr-k8s-webhook-server/0.log" Dec 06 10:21:18 crc kubenswrapper[4672]: I1206 10:21:18.896092 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.104762 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.130811 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.135325 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.174570 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.305835 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.320916 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.381028 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.400497 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.580113 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.580316 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.611920 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/controller/0.log" Dec 06 10:21:19 crc kubenswrapper[4672]: I1206 10:21:19.627822 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.041006 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/kube-rbac-proxy-frr/0.log" Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.083704 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/kube-rbac-proxy/0.log" Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.117581 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/frr-metrics/0.log" Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.290941 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/reloader/0.log" Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.434574 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-765868b4fd-qt2wp_216580e9-9198-4b66-bf50-46df3a04c88e/manager/0.log" Dec 06 10:21:20 
Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.742655 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6df5976447-kzfnr_8faaf896-2bc9-489b-97dc-29e0efa86a91/webhook-server/0.log"
Dec 06 10:21:20 crc kubenswrapper[4672]: I1206 10:21:20.910850 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-skjzl_47d9472b-be65-46ea-8eff-fa70e315ed49/kube-rbac-proxy/0.log"
Dec 06 10:21:21 crc kubenswrapper[4672]: I1206 10:21:21.522211 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-skjzl_47d9472b-be65-46ea-8eff-fa70e315ed49/speaker/0.log"
Dec 06 10:21:21 crc kubenswrapper[4672]: I1206 10:21:21.673573 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/frr/0.log"
Dec 06 10:21:22 crc kubenswrapper[4672]: I1206 10:21:22.559401 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:21:22 crc kubenswrapper[4672]: E1206 10:21:22.559808 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.290098 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/util/0.log"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.556791 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:21:35 crc kubenswrapper[4672]: E1206 10:21:35.557219 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.672089 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/util/0.log"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.675029 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/pull/0.log"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.695023 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/pull/0.log"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.870552 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/util/0.log"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.891783 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/extract/0.log"
Dec 06 10:21:35 crc kubenswrapper[4672]: I1206 10:21:35.913855 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/pull/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.108152 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/util/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.282376 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/util/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.323167 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/pull/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.347670 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/pull/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.499787 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/pull/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.615229 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/extract/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.633586 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/util/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.781364 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-utilities/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.939690 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-content/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.948678 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-utilities/0.log"
Dec 06 10:21:36 crc kubenswrapper[4672]: I1206 10:21:36.960055 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-content/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.261744 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-utilities/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.292609 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-content/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.456542 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-utilities/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.748563 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/registry-server/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.831641 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-content/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.834272 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-content/0.log"
Dec 06 10:21:37 crc kubenswrapper[4672]: I1206 10:21:37.842670 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-utilities/0.log"
Dec 06 10:21:38 crc kubenswrapper[4672]: I1206 10:21:38.259023 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-content/0.log"
Dec 06 10:21:38 crc kubenswrapper[4672]: I1206 10:21:38.347345 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-utilities/0.log"
Dec 06 10:21:38 crc kubenswrapper[4672]: I1206 10:21:38.587430 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zhbdf_6f374204-77e4-4b75-afaf-43579bc0506a/marketplace-operator/0.log"
Dec 06 10:21:38 crc kubenswrapper[4672]: I1206 10:21:38.828134 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-utilities/0.log"
Dec 06 10:21:38 crc kubenswrapper[4672]: I1206 10:21:38.945640 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/registry-server/0.log"
Dec 06 10:21:39 crc kubenswrapper[4672]: I1206 10:21:39.022607 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-utilities/0.log"
Dec 06 10:21:39 crc kubenswrapper[4672]: I1206 10:21:39.093119 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-content/0.log"
Dec 06 10:21:39 crc kubenswrapper[4672]: I1206 10:21:39.094726 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-content/0.log"
Dec 06 10:21:39 crc kubenswrapper[4672]: I1206 10:21:39.247318 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-content/0.log"
Dec 06 10:21:39 crc kubenswrapper[4672]: I1206 10:21:39.336192 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-utilities/0.log"
Dec 06 10:21:39 crc kubenswrapper[4672]: I1206 10:21:39.412711 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/registry-server/0.log"
Dec 06 10:21:40 crc kubenswrapper[4672]: I1206 10:21:40.069166 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-utilities/0.log"
Dec 06 10:21:40 crc kubenswrapper[4672]: I1206 10:21:40.210623 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-utilities/0.log"
Dec 06 10:21:40 crc kubenswrapper[4672]: I1206 10:21:40.259763 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-content/0.log"
Dec 06 10:21:40 crc kubenswrapper[4672]: I1206 10:21:40.279751 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-content/0.log"
Dec 06 10:21:40 crc kubenswrapper[4672]: I1206 10:21:40.521079 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-content/0.log"
Dec 06 10:21:40 crc kubenswrapper[4672]: I1206 10:21:40.595505 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-utilities/0.log"
Dec 06 10:21:41 crc kubenswrapper[4672]: I1206 10:21:41.028977 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/registry-server/0.log"
Dec 06 10:21:48 crc kubenswrapper[4672]: I1206 10:21:48.558346 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:21:48 crc kubenswrapper[4672]: E1206 10:21:48.559086 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:22:01 crc kubenswrapper[4672]: I1206 10:22:01.556518 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:22:01 crc kubenswrapper[4672]: E1206 10:22:01.557303 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:22:13 crc kubenswrapper[4672]: I1206 10:22:13.557466 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:22:13 crc kubenswrapper[4672]: E1206 10:22:13.558818 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:22:26 crc kubenswrapper[4672]: I1206 10:22:26.557408 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:22:26 crc kubenswrapper[4672]: E1206 10:22:26.558300 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:22:39 crc kubenswrapper[4672]: I1206 10:22:39.557116 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:22:39 crc kubenswrapper[4672]: E1206 10:22:39.558367 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.490447 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mw6m7"] Dec 06 10:22:46 crc kubenswrapper[4672]: E1206 10:22:46.491890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdf43a6-fd1b-471a-b9c1-7839dd02a969" containerName="container-00" Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.491908 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdf43a6-fd1b-471a-b9c1-7839dd02a969" containerName="container-00" Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.492164 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fdf43a6-fd1b-471a-b9c1-7839dd02a969" containerName="container-00" Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.493559 4672 util.go:30] "No sandbox for pod can be found. 
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.600726 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-catalog-content\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.601025 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66f7k\" (UniqueName: \"kubernetes.io/projected/401de309-0791-455c-81cf-67acc51335fd-kube-api-access-66f7k\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.601089 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-utilities\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.815703 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-utilities\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.816127 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-catalog-content\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.816364 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-utilities\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.816453 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-catalog-content\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.816583 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66f7k\" (UniqueName: \"kubernetes.io/projected/401de309-0791-455c-81cf-67acc51335fd-kube-api-access-66f7k\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.857394 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66f7k\" (UniqueName: \"kubernetes.io/projected/401de309-0791-455c-81cf-67acc51335fd-kube-api-access-66f7k\") pod \"redhat-marketplace-mw6m7\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") " pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:46 crc kubenswrapper[4672]: I1206 10:22:46.876207 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw6m7"]
Dec 06 10:22:47 crc kubenswrapper[4672]: I1206 10:22:47.115346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:47 crc kubenswrapper[4672]: I1206 10:22:47.818295 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw6m7"]
Dec 06 10:22:48 crc kubenswrapper[4672]: I1206 10:22:48.644168 4672 generic.go:334] "Generic (PLEG): container finished" podID="401de309-0791-455c-81cf-67acc51335fd" containerID="a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f" exitCode=0
Dec 06 10:22:48 crc kubenswrapper[4672]: I1206 10:22:48.644229 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerDied","Data":"a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f"}
Dec 06 10:22:48 crc kubenswrapper[4672]: I1206 10:22:48.644838 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerStarted","Data":"f47356f49f0d5e6d5aeacf92479898f83c530ace91b68d52b91a5b110343db5c"}
Dec 06 10:22:48 crc kubenswrapper[4672]: I1206 10:22:48.646099 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 10:22:49 crc kubenswrapper[4672]: I1206 10:22:49.663932 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerStarted","Data":"67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740"}
Dec 06 10:22:50 crc kubenswrapper[4672]: I1206 10:22:50.677376 4672 generic.go:334] "Generic (PLEG): container finished" podID="401de309-0791-455c-81cf-67acc51335fd" containerID="67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740" exitCode=0
Dec 06 10:22:50 crc kubenswrapper[4672]: I1206 10:22:50.677692 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerDied","Data":"67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740"}
Dec 06 10:22:51 crc kubenswrapper[4672]: I1206 10:22:51.556920 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:22:51 crc kubenswrapper[4672]: E1206 10:22:51.557413 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:22:51 crc kubenswrapper[4672]: I1206 10:22:51.689520 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerStarted","Data":"e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8"}
Dec 06 10:22:51 crc kubenswrapper[4672]: I1206 10:22:51.713393 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mw6m7" podStartSLOduration=3.208331009 podStartE2EDuration="5.71337058s" podCreationTimestamp="2025-12-06 10:22:46 +0000 UTC" firstStartedPulling="2025-12-06 10:22:48.645888043 +0000 UTC m=+4586.390148330" lastFinishedPulling="2025-12-06 10:22:51.150927604 +0000 UTC m=+4588.895187901" observedRunningTime="2025-12-06 10:22:51.706403111 +0000 UTC m=+4589.450663418" watchObservedRunningTime="2025-12-06 10:22:51.71337058 +0000 UTC m=+4589.457630867"
Dec 06 10:22:57 crc kubenswrapper[4672]: I1206 10:22:57.117138 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:57 crc kubenswrapper[4672]: I1206 10:22:57.117793 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:57 crc kubenswrapper[4672]: I1206 10:22:57.190520 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:57 crc kubenswrapper[4672]: I1206 10:22:57.792265 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:22:58 crc kubenswrapper[4672]: I1206 10:22:58.471420 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw6m7"]
Dec 06 10:22:59 crc kubenswrapper[4672]: I1206 10:22:59.760115 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mw6m7" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="registry-server" containerID="cri-o://e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8" gracePeriod=2
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.296827 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.441240 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-utilities\") pod \"401de309-0791-455c-81cf-67acc51335fd\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") "
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.441566 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66f7k\" (UniqueName: \"kubernetes.io/projected/401de309-0791-455c-81cf-67acc51335fd-kube-api-access-66f7k\") pod \"401de309-0791-455c-81cf-67acc51335fd\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") "
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.441783 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-catalog-content\") pod \"401de309-0791-455c-81cf-67acc51335fd\" (UID: \"401de309-0791-455c-81cf-67acc51335fd\") "
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.443758 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-utilities" (OuterVolumeSpecName: "utilities") pod "401de309-0791-455c-81cf-67acc51335fd" (UID: "401de309-0791-455c-81cf-67acc51335fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.450202 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401de309-0791-455c-81cf-67acc51335fd-kube-api-access-66f7k" (OuterVolumeSpecName: "kube-api-access-66f7k") pod "401de309-0791-455c-81cf-67acc51335fd" (UID: "401de309-0791-455c-81cf-67acc51335fd"). InnerVolumeSpecName "kube-api-access-66f7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.486194 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "401de309-0791-455c-81cf-67acc51335fd" (UID: "401de309-0791-455c-81cf-67acc51335fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.544526 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.544564 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66f7k\" (UniqueName: \"kubernetes.io/projected/401de309-0791-455c-81cf-67acc51335fd-kube-api-access-66f7k\") on node \"crc\" DevicePath \"\""
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.544578 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401de309-0791-455c-81cf-67acc51335fd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.778000 4672 generic.go:334] "Generic (PLEG): container finished" podID="401de309-0791-455c-81cf-67acc51335fd" containerID="e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8" exitCode=0
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.778049 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerDied","Data":"e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8"}
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.778072 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mw6m7"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.778095 4672 scope.go:117] "RemoveContainer" containerID="e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.778077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mw6m7" event={"ID":"401de309-0791-455c-81cf-67acc51335fd","Type":"ContainerDied","Data":"f47356f49f0d5e6d5aeacf92479898f83c530ace91b68d52b91a5b110343db5c"}
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.809252 4672 scope.go:117] "RemoveContainer" containerID="67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.811171 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw6m7"]
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.821721 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mw6m7"]
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.833454 4672 scope.go:117] "RemoveContainer" containerID="a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.890972 4672 scope.go:117] "RemoveContainer" containerID="e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8"
Dec 06 10:23:00 crc kubenswrapper[4672]: E1206 10:23:00.891574 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8\": container with ID starting with e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8 not found: ID does not exist" containerID="e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.891733 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8"} err="failed to get container status \"e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8\": rpc error: code = NotFound desc = could not find container \"e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8\": container with ID starting with e738d571e3040885e578646139bb2bd5234798f3a750006161bbee84e6a182d8 not found: ID does not exist"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.891767 4672 scope.go:117] "RemoveContainer" containerID="67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740"
Dec 06 10:23:00 crc kubenswrapper[4672]: E1206 10:23:00.892327 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740\": container with ID starting with 67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740 not found: ID does not exist" containerID="67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.892353 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740"} err="failed to get container status \"67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740\": rpc error: code = NotFound desc = could not find container \"67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740\": container with ID starting with 67bb0c45852465869b51369c2108315190c27fd580d1ee1a402e0058e5c82740 not found: ID does not exist"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.892372 4672 scope.go:117] "RemoveContainer" containerID="a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f"
Dec 06 10:23:00 crc kubenswrapper[4672]: E1206 10:23:00.892897 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f\": container with ID starting with a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f not found: ID does not exist" containerID="a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f"
Dec 06 10:23:00 crc kubenswrapper[4672]: I1206 10:23:00.892961 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f"} err="failed to get container status \"a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f\": rpc error: code = NotFound desc = could not find container \"a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f\": container with ID starting with a5fabff347e73fdc55b714372f1c41782aecd544423da31074f2dfdbfed11f7f not found: ID does not exist"
Dec 06 10:23:02 crc kubenswrapper[4672]: I1206 10:23:02.565376 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:23:02 crc kubenswrapper[4672]: E1206 10:23:02.566230 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
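The E/I pairs above show the kubelet asking the runtime for the status of containers it has already removed; CRI-O answers with gRPC NotFound and the kubelet logs the error and moves on, which keeps deletion idempotent. A minimal Go sketch of that pattern using standard gRPC status codes (the helper name is illustrative, not the kubelet's):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent treats a NotFound from the container runtime as
// "already gone" rather than a failure, mirroring the entries above.
func removeIfPresent(remove func(id string) error, id string) error {
	err := remove(id)
	if err != nil && status.Code(err) != codes.NotFound {
		return err
	}
	return nil
}

func main() {
	alreadyGone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeIfPresent(alreadyGone, "e738d571e304")) // prints <nil>
}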
Dec 06 10:23:02 crc kubenswrapper[4672]: I1206 10:23:02.574740 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401de309-0791-455c-81cf-67acc51335fd" path="/var/lib/kubelet/pods/401de309-0791-455c-81cf-67acc51335fd/volumes"
Dec 06 10:23:15 crc kubenswrapper[4672]: I1206 10:23:15.558229 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:23:15 crc kubenswrapper[4672]: E1206 10:23:15.559416 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:23:28 crc kubenswrapper[4672]: I1206 10:23:28.557721 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:23:28 crc kubenswrapper[4672]: E1206 10:23:28.558771 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:23:29 crc kubenswrapper[4672]: I1206 10:23:29.340486 4672 scope.go:117] "RemoveContainer" containerID="d66a4e55cf79990bbcb7878d5caf4251799ea74afb528278106d1dd0d7aec265"
Dec 06 10:23:41 crc kubenswrapper[4672]: I1206 10:23:41.557343 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:23:41 crc kubenswrapper[4672]: E1206 10:23:41.559921 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:23:52 crc kubenswrapper[4672]: I1206 10:23:52.564363 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:23:52 crc kubenswrapper[4672]: E1206 10:23:52.565177 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:24:00 crc kubenswrapper[4672]: I1206 10:24:00.441779 4672 generic.go:334] "Generic (PLEG): container finished" podID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerID="989e6317ca2735021bb8793db4f29d7c9773b335d05055aeb5752f96430ea0b7" exitCode=0
Dec 06 10:24:00 crc kubenswrapper[4672]: I1206 10:24:00.441932 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n57zb/must-gather-q2d4j" event={"ID":"45976e37-3f07-4fe3-ad05-d94ab18e2ce1","Type":"ContainerDied","Data":"989e6317ca2735021bb8793db4f29d7c9773b335d05055aeb5752f96430ea0b7"}
Dec 06 10:24:00 crc kubenswrapper[4672]: I1206 10:24:00.447685 4672 scope.go:117] "RemoveContainer" containerID="989e6317ca2735021bb8793db4f29d7c9773b335d05055aeb5752f96430ea0b7"
Dec 06 10:24:00 crc kubenswrapper[4672]: I1206 10:24:00.794981 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n57zb_must-gather-q2d4j_45976e37-3f07-4fe3-ad05-d94ab18e2ce1/gather/0.log"
Dec 06 10:24:03 crc kubenswrapper[4672]: I1206 10:24:03.557465 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:24:03 crc kubenswrapper[4672]: E1206 10:24:03.558068 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:24:09 crc kubenswrapper[4672]: I1206 10:24:09.394905 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n57zb/must-gather-q2d4j"]
Dec 06 10:24:09 crc kubenswrapper[4672]: I1206 10:24:09.396098 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n57zb/must-gather-q2d4j" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="copy" containerID="cri-o://68ccde6b01afcb0bae45d7d09f1163cf3ad68815f92cdfd37360d6939271be52" gracePeriod=2
Dec 06 10:24:09 crc kubenswrapper[4672]: I1206 10:24:09.412661 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n57zb/must-gather-q2d4j"]
Dec 06 10:24:09 crc kubenswrapper[4672]: I1206 10:24:09.531426 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n57zb_must-gather-q2d4j_45976e37-3f07-4fe3-ad05-d94ab18e2ce1/copy/0.log"
Dec 06 10:24:09 crc kubenswrapper[4672]: I1206 10:24:09.532063 4672 generic.go:334] "Generic (PLEG): container finished" podID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerID="68ccde6b01afcb0bae45d7d09f1163cf3ad68815f92cdfd37360d6939271be52" exitCode=143
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.267799 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n57zb_must-gather-q2d4j_45976e37-3f07-4fe3-ad05-d94ab18e2ce1/copy/0.log"
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.268304 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/must-gather-q2d4j"
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.419800 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt2mb\" (UniqueName: \"kubernetes.io/projected/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-kube-api-access-bt2mb\") pod \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") "
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.420184 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-must-gather-output\") pod \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\" (UID: \"45976e37-3f07-4fe3-ad05-d94ab18e2ce1\") "
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.438493 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-kube-api-access-bt2mb" (OuterVolumeSpecName: "kube-api-access-bt2mb") pod "45976e37-3f07-4fe3-ad05-d94ab18e2ce1" (UID: "45976e37-3f07-4fe3-ad05-d94ab18e2ce1"). InnerVolumeSpecName "kube-api-access-bt2mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.524134 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt2mb\" (UniqueName: \"kubernetes.io/projected/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-kube-api-access-bt2mb\") on node \"crc\" DevicePath \"\""
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.543892 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n57zb_must-gather-q2d4j_45976e37-3f07-4fe3-ad05-d94ab18e2ce1/copy/0.log"
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.544241 4672 scope.go:117] "RemoveContainer" containerID="68ccde6b01afcb0bae45d7d09f1163cf3ad68815f92cdfd37360d6939271be52"
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.544282 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n57zb/must-gather-q2d4j"
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.580086 4672 scope.go:117] "RemoveContainer" containerID="989e6317ca2735021bb8793db4f29d7c9773b335d05055aeb5752f96430ea0b7"
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.643997 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "45976e37-3f07-4fe3-ad05-d94ab18e2ce1" (UID: "45976e37-3f07-4fe3-ad05-d94ab18e2ce1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:24:10 crc kubenswrapper[4672]: I1206 10:24:10.734526 4672 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45976e37-3f07-4fe3-ad05-d94ab18e2ce1-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 06 10:24:12 crc kubenswrapper[4672]: I1206 10:24:12.567741 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" path="/var/lib/kubelet/pods/45976e37-3f07-4fe3-ad05-d94ab18e2ce1/volumes"
Dec 06 10:24:18 crc kubenswrapper[4672]: I1206 10:24:18.556909 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6"
Dec 06 10:24:19 crc kubenswrapper[4672]: I1206 10:24:19.636869 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"e7d8e30269340f1018466db3bc64484f24c1f912f42266b66c55a54ccd41efab"}
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.206736 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jwn7c"]
Dec 06 10:25:49 crc kubenswrapper[4672]: E1206 10:25:49.209591 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="gather"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.209648 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="gather"
Dec 06 10:25:49 crc kubenswrapper[4672]: E1206 10:25:49.209670 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="extract-utilities"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.209682 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="extract-utilities"
Dec 06 10:25:49 crc kubenswrapper[4672]: E1206 10:25:49.209711 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="registry-server"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.209723 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="registry-server"
Dec 06 10:25:49 crc kubenswrapper[4672]: E1206 10:25:49.209740 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="copy"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.209751 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="copy"
Dec 06 10:25:49 crc kubenswrapper[4672]: E1206 10:25:49.209780 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="extract-content"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.209793 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="extract-content"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.210108 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="gather"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.210137 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="401de309-0791-455c-81cf-67acc51335fd" containerName="registry-server"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.210168 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="45976e37-3f07-4fe3-ad05-d94ab18e2ce1" containerName="copy"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.213676 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.222709 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwn7c"]
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.353761 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-utilities\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.353797 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-catalog-content\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.353892 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk4sz\" (UniqueName: \"kubernetes.io/projected/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-kube-api-access-gk4sz\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.455645 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk4sz\" (UniqueName: \"kubernetes.io/projected/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-kube-api-access-gk4sz\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.455783 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-utilities\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.455802 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-catalog-content\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.456267 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-catalog-content\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.456712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-utilities\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.480383 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk4sz\" (UniqueName: \"kubernetes.io/projected/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-kube-api-access-gk4sz\") pod \"redhat-operators-jwn7c\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") " pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:49 crc kubenswrapper[4672]: I1206 10:25:49.539694 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:50 crc kubenswrapper[4672]: I1206 10:25:50.035919 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwn7c"]
Dec 06 10:25:50 crc kubenswrapper[4672]: I1206 10:25:50.573988 4672 generic.go:334] "Generic (PLEG): container finished" podID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerID="fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a" exitCode=0
Dec 06 10:25:50 crc kubenswrapper[4672]: I1206 10:25:50.574220 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerDied","Data":"fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a"}
Dec 06 10:25:50 crc kubenswrapper[4672]: I1206 10:25:50.574243 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerStarted","Data":"88f097795798371026c8767a028a04b7cb3d17dfd4c5c3d732ff5ca4bc89358e"}
Dec 06 10:25:51 crc kubenswrapper[4672]: I1206 10:25:51.586623 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerStarted","Data":"b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954"}
Dec 06 10:25:55 crc kubenswrapper[4672]: I1206 10:25:55.651158 4672 generic.go:334] "Generic (PLEG): container finished" podID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerID="b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954" exitCode=0
Dec 06 10:25:55 crc kubenswrapper[4672]: I1206 10:25:55.651361 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerDied","Data":"b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954"}
Dec 06 10:25:56 crc kubenswrapper[4672]: I1206 10:25:56.663868 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerStarted","Data":"345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169"}
Dec 06 10:25:56 crc kubenswrapper[4672]: I1206 10:25:56.693233 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jwn7c" podStartSLOduration=2.253627734 podStartE2EDuration="7.69320909s" podCreationTimestamp="2025-12-06 10:25:49 +0000 UTC" firstStartedPulling="2025-12-06 10:25:50.576291615 +0000 UTC m=+4768.320551902" lastFinishedPulling="2025-12-06 10:25:56.015872971 +0000 UTC m=+4773.760133258" observedRunningTime="2025-12-06 10:25:56.684092663 +0000 UTC m=+4774.428352980" watchObservedRunningTime="2025-12-06 10:25:56.69320909 +0000 UTC m=+4774.437469387"
Dec 06 10:25:59 crc kubenswrapper[4672]: I1206 10:25:59.540032 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:25:59 crc kubenswrapper[4672]: I1206 10:25:59.540591 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:26:00 crc kubenswrapper[4672]: I1206 10:26:00.931642 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jwn7c" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="registry-server" probeResult="failure" output=<
Dec 06 10:26:00 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s
Dec 06 10:26:00 crc kubenswrapper[4672]: >
Dec 06 10:26:04 crc kubenswrapper[4672]: I1206 10:26:04.849992 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wh77l"]
Dec 06 10:26:04 crc kubenswrapper[4672]: I1206 10:26:04.867965 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wh77l"]
Dec 06 10:26:04 crc kubenswrapper[4672]: I1206 10:26:04.870658 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.005313 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fmqs\" (UniqueName: \"kubernetes.io/projected/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-kube-api-access-4fmqs\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.005374 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-catalog-content\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.005833 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-utilities\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.107400 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-utilities\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.107489 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fmqs\" (UniqueName: \"kubernetes.io/projected/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-kube-api-access-4fmqs\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.107561 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-catalog-content\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.108041 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-catalog-content\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.108190 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-utilities\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.131912 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fmqs\" (UniqueName: \"kubernetes.io/projected/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-kube-api-access-4fmqs\") pod \"community-operators-wh77l\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") " pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.197281 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:05 crc kubenswrapper[4672]: W1206 10:26:05.768962 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a015d70_4201_4f19_bfbb_9a8d4d5398b4.slice/crio-0471cee7dd940f8b52eaa32cc2f7c458e20b5885dd3179a37fd51cb48f388421 WatchSource:0}: Error finding container 0471cee7dd940f8b52eaa32cc2f7c458e20b5885dd3179a37fd51cb48f388421: Status 404 returned error can't find the container with id 0471cee7dd940f8b52eaa32cc2f7c458e20b5885dd3179a37fd51cb48f388421
Dec 06 10:26:05 crc kubenswrapper[4672]: I1206 10:26:05.769010 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wh77l"]
Dec 06 10:26:06 crc kubenswrapper[4672]: I1206 10:26:06.771773 4672 generic.go:334] "Generic (PLEG): container finished" podID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerID="8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6" exitCode=0
Dec 06 10:26:06 crc kubenswrapper[4672]: I1206 10:26:06.771840 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerDied","Data":"8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6"}
Dec 06 10:26:06 crc kubenswrapper[4672]: I1206 10:26:06.772322 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerStarted","Data":"0471cee7dd940f8b52eaa32cc2f7c458e20b5885dd3179a37fd51cb48f388421"}
Dec 06 10:26:07 crc kubenswrapper[4672]: I1206 10:26:07.783077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerStarted","Data":"fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd"}
Dec 06 10:26:09 crc kubenswrapper[4672]: I1206 10:26:09.625640 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:26:09 crc kubenswrapper[4672]: I1206 10:26:09.691471 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:26:09 crc kubenswrapper[4672]: I1206 10:26:09.812376 4672 generic.go:334] "Generic (PLEG): container finished" podID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerID="fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd" exitCode=0
Dec 06 10:26:09 crc kubenswrapper[4672]: I1206 10:26:09.812510 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerDied","Data":"fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd"}
Dec 06 10:26:10 crc kubenswrapper[4672]: I1206 10:26:10.420884 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwn7c"]
Dec 06 10:26:10 crc kubenswrapper[4672]: I1206 10:26:10.820585 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jwn7c" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="registry-server" containerID="cri-o://345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169" gracePeriod=2
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.345636 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.458527 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-utilities\") pod \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") "
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.458628 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-catalog-content\") pod \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") "
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.458803 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk4sz\" (UniqueName: \"kubernetes.io/projected/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-kube-api-access-gk4sz\") pod \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\" (UID: \"aa5524cd-c42a-4fe4-81b0-5f57648aca6f\") "
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.459817 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-utilities" (OuterVolumeSpecName: "utilities") pod "aa5524cd-c42a-4fe4-81b0-5f57648aca6f" (UID: "aa5524cd-c42a-4fe4-81b0-5f57648aca6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.468066 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-kube-api-access-gk4sz" (OuterVolumeSpecName: "kube-api-access-gk4sz") pod "aa5524cd-c42a-4fe4-81b0-5f57648aca6f" (UID: "aa5524cd-c42a-4fe4-81b0-5f57648aca6f"). InnerVolumeSpecName "kube-api-access-gk4sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.561917 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa5524cd-c42a-4fe4-81b0-5f57648aca6f" (UID: "aa5524cd-c42a-4fe4-81b0-5f57648aca6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.562237 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.562256 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.562266 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk4sz\" (UniqueName: \"kubernetes.io/projected/aa5524cd-c42a-4fe4-81b0-5f57648aca6f-kube-api-access-gk4sz\") on node \"crc\" DevicePath \"\""
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.832969 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerStarted","Data":"6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47"}
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.836232 4672 generic.go:334] "Generic (PLEG): container finished" podID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerID="345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169" exitCode=0
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.836275 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerDied","Data":"345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169"}
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.836289 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwn7c"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.836303 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwn7c" event={"ID":"aa5524cd-c42a-4fe4-81b0-5f57648aca6f","Type":"ContainerDied","Data":"88f097795798371026c8767a028a04b7cb3d17dfd4c5c3d732ff5ca4bc89358e"}
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.836322 4672 scope.go:117] "RemoveContainer" containerID="345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.861107 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wh77l" podStartSLOduration=4.330368067 podStartE2EDuration="7.861088693s" podCreationTimestamp="2025-12-06 10:26:04 +0000 UTC" firstStartedPulling="2025-12-06 10:26:06.774187661 +0000 UTC m=+4784.518447948" lastFinishedPulling="2025-12-06 10:26:10.304908277 +0000 UTC m=+4788.049168574" observedRunningTime="2025-12-06 10:26:11.859140291 +0000 UTC m=+4789.603400568" watchObservedRunningTime="2025-12-06 10:26:11.861088693 +0000 UTC m=+4789.605348980"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.870752 4672 scope.go:117] "RemoveContainer" containerID="b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.889065 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwn7c"]
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.913900 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jwn7c"]
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.917819 4672 scope.go:117] "RemoveContainer" containerID="fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.954966 4672 scope.go:117] "RemoveContainer" containerID="345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169"
Dec 06 10:26:11 crc kubenswrapper[4672]: E1206 10:26:11.955555 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169\": container with ID starting with 345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169 not found: ID does not exist" containerID="345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.955640 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169"} err="failed to get container status \"345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169\": rpc error: code = NotFound desc = could not find container \"345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169\": container with ID starting with 345bab35d9c76bf17f18e947187945016c0141e5dabce2effd4a7b2561dec169 not found: ID does not exist"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.955672 4672 scope.go:117] "RemoveContainer" containerID="b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954"
Dec 06 10:26:11 crc kubenswrapper[4672]: E1206 10:26:11.956420 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954\": container with ID starting with b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954 not found: ID does not exist" containerID="b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.956458 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954"} err="failed to get container status \"b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954\": rpc error: code = NotFound desc = could not find container \"b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954\": container with ID starting with b29b48b71409e55df8f4898aea02f16968f3f701479d37dc635a536ad65de954 not found: ID does not exist"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.956476 4672 scope.go:117] "RemoveContainer" containerID="fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a"
Dec 06 10:26:11 crc kubenswrapper[4672]: E1206 10:26:11.958409 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a\": container with ID starting with fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a not found: ID does not exist" containerID="fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a"
Dec 06 10:26:11 crc kubenswrapper[4672]: I1206 10:26:11.958456 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a"} err="failed to get container status \"fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a\": rpc error: code = NotFound desc = could not find container \"fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a\": container with ID starting with fbe12fee98b1987ebf19c2e75d365bcac4a6237f21752e1da2b27f9cdc0b020a not found: ID does not exist"
Dec 06 10:26:12 crc kubenswrapper[4672]: I1206 10:26:12.578002 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" path="/var/lib/kubelet/pods/aa5524cd-c42a-4fe4-81b0-5f57648aca6f/volumes"
Dec 06 10:26:15 crc kubenswrapper[4672]: I1206 10:26:15.198025 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:15 crc kubenswrapper[4672]: I1206 10:26:15.199554 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:15 crc kubenswrapper[4672]: I1206 10:26:15.244715 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:15 crc kubenswrapper[4672]: I1206 10:26:15.937336 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:16 crc kubenswrapper[4672]: I1206 10:26:16.419358 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wh77l"]
Dec 06 10:26:17 crc kubenswrapper[4672]: I1206 10:26:17.893329 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wh77l" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="registry-server" containerID="cri-o://6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47" gracePeriod=2
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.391438 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.510149 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fmqs\" (UniqueName: \"kubernetes.io/projected/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-kube-api-access-4fmqs\") pod \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") "
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.510258 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-catalog-content\") pod \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") "
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.510428 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-utilities\") pod \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\" (UID: \"3a015d70-4201-4f19-bfbb-9a8d4d5398b4\") "
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.511310 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-utilities" (OuterVolumeSpecName: "utilities") pod "3a015d70-4201-4f19-bfbb-9a8d4d5398b4" (UID: "3a015d70-4201-4f19-bfbb-9a8d4d5398b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.540887 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-kube-api-access-4fmqs" (OuterVolumeSpecName: "kube-api-access-4fmqs") pod "3a015d70-4201-4f19-bfbb-9a8d4d5398b4" (UID: "3a015d70-4201-4f19-bfbb-9a8d4d5398b4"). InnerVolumeSpecName "kube-api-access-4fmqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.593552 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a015d70-4201-4f19-bfbb-9a8d4d5398b4" (UID: "3a015d70-4201-4f19-bfbb-9a8d4d5398b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.612717 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.612751 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fmqs\" (UniqueName: \"kubernetes.io/projected/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-kube-api-access-4fmqs\") on node \"crc\" DevicePath \"\""
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.612762 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a015d70-4201-4f19-bfbb-9a8d4d5398b4-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.904847 4672 generic.go:334] "Generic (PLEG): container finished" podID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerID="6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47" exitCode=0
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.904907 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wh77l"
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.904956 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerDied","Data":"6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47"}
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.905355 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh77l" event={"ID":"3a015d70-4201-4f19-bfbb-9a8d4d5398b4","Type":"ContainerDied","Data":"0471cee7dd940f8b52eaa32cc2f7c458e20b5885dd3179a37fd51cb48f388421"}
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.905378 4672 scope.go:117] "RemoveContainer" containerID="6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47"
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.937545 4672 scope.go:117] "RemoveContainer" containerID="fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd"
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.946166 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wh77l"]
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.954906 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wh77l"]
Dec 06 10:26:18 crc kubenswrapper[4672]: I1206 10:26:18.970753 4672 scope.go:117] "RemoveContainer" containerID="8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6"
Dec 06 10:26:19 crc kubenswrapper[4672]: I1206 10:26:19.017022 4672 scope.go:117] "RemoveContainer" containerID="6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47"
Dec 06 10:26:19 crc kubenswrapper[4672]: E1206 10:26:19.017664 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47\": container with ID starting with 6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47 not found: ID does not exist" containerID="6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47"
Dec 06 10:26:19 crc kubenswrapper[4672]: I1206 10:26:19.017702
4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47"} err="failed to get container status \"6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47\": rpc error: code = NotFound desc = could not find container \"6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47\": container with ID starting with 6a45685b8def21d66a392bcbe7ee28bb474eb551efe9d48ec57653ef08e1cf47 not found: ID does not exist" Dec 06 10:26:19 crc kubenswrapper[4672]: I1206 10:26:19.017726 4672 scope.go:117] "RemoveContainer" containerID="fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd" Dec 06 10:26:19 crc kubenswrapper[4672]: E1206 10:26:19.018822 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd\": container with ID starting with fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd not found: ID does not exist" containerID="fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd" Dec 06 10:26:19 crc kubenswrapper[4672]: I1206 10:26:19.018871 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd"} err="failed to get container status \"fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd\": rpc error: code = NotFound desc = could not find container \"fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd\": container with ID starting with fe8b793841a4bef56365738a5808bf8d7191cef6382467ed664ea6718b5c95dd not found: ID does not exist" Dec 06 10:26:19 crc kubenswrapper[4672]: I1206 10:26:19.018891 4672 scope.go:117] "RemoveContainer" containerID="8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6" Dec 06 10:26:19 crc kubenswrapper[4672]: E1206 10:26:19.019390 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6\": container with ID starting with 8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6 not found: ID does not exist" containerID="8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6" Dec 06 10:26:19 crc kubenswrapper[4672]: I1206 10:26:19.019421 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6"} err="failed to get container status \"8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6\": rpc error: code = NotFound desc = could not find container \"8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6\": container with ID starting with 8c1fa8ad6511f3a6cefca4f21d343c004f89bae4136e22394df0ee85c1806ec6 not found: ID does not exist" Dec 06 10:26:20 crc kubenswrapper[4672]: I1206 10:26:20.566615 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" path="/var/lib/kubelet/pods/3a015d70-4201-4f19-bfbb-9a8d4d5398b4/volumes" Dec 06 10:26:42 crc kubenswrapper[4672]: I1206 10:26:42.321367 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:26:42 crc kubenswrapper[4672]: I1206 10:26:42.322092 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.306908 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2jnct/must-gather-c8qqv"] Dec 06 10:27:09 crc kubenswrapper[4672]: E1206 10:27:09.307991 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="registry-server" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308007 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="registry-server" Dec 06 10:27:09 crc kubenswrapper[4672]: E1206 10:27:09.308016 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="extract-content" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308023 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="extract-content" Dec 06 10:27:09 crc kubenswrapper[4672]: E1206 10:27:09.308050 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="extract-content" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308056 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="extract-content" Dec 06 10:27:09 crc kubenswrapper[4672]: E1206 10:27:09.308071 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="registry-server" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308077 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="registry-server" Dec 06 10:27:09 crc kubenswrapper[4672]: E1206 10:27:09.308092 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="extract-utilities" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308098 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="extract-utilities" Dec 06 10:27:09 crc kubenswrapper[4672]: E1206 10:27:09.308106 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="extract-utilities" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308111 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="extract-utilities" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308309 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a015d70-4201-4f19-bfbb-9a8d4d5398b4" containerName="registry-server" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.308318 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5524cd-c42a-4fe4-81b0-5f57648aca6f" containerName="registry-server" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.309406 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.317043 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2jnct"/"openshift-service-ca.crt" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.320836 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2jnct"/"kube-root-ca.crt" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.325788 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2jnct/must-gather-c8qqv"] Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.326201 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2jnct"/"default-dockercfg-kh5vx" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.427234 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c237ec5d-7c8c-423b-b427-d5064e2bce86-must-gather-output\") pod \"must-gather-c8qqv\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.427372 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhhvq\" (UniqueName: \"kubernetes.io/projected/c237ec5d-7c8c-423b-b427-d5064e2bce86-kube-api-access-lhhvq\") pod \"must-gather-c8qqv\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.530206 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c237ec5d-7c8c-423b-b427-d5064e2bce86-must-gather-output\") pod \"must-gather-c8qqv\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.530480 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhhvq\" (UniqueName: \"kubernetes.io/projected/c237ec5d-7c8c-423b-b427-d5064e2bce86-kube-api-access-lhhvq\") pod \"must-gather-c8qqv\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.531687 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c237ec5d-7c8c-423b-b427-d5064e2bce86-must-gather-output\") pod \"must-gather-c8qqv\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.572873 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhhvq\" (UniqueName: \"kubernetes.io/projected/c237ec5d-7c8c-423b-b427-d5064e2bce86-kube-api-access-lhhvq\") pod \"must-gather-c8qqv\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:09 crc kubenswrapper[4672]: I1206 10:27:09.626185 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:27:10 crc kubenswrapper[4672]: I1206 10:27:10.091097 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2jnct/must-gather-c8qqv"] Dec 06 10:27:10 crc kubenswrapper[4672]: I1206 10:27:10.471263 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/must-gather-c8qqv" event={"ID":"c237ec5d-7c8c-423b-b427-d5064e2bce86","Type":"ContainerStarted","Data":"ca158d895deb60a04a1aed917b90343a0a2cfea1d88b2c4c046f6adfd768dd73"} Dec 06 10:27:11 crc kubenswrapper[4672]: I1206 10:27:11.481253 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/must-gather-c8qqv" event={"ID":"c237ec5d-7c8c-423b-b427-d5064e2bce86","Type":"ContainerStarted","Data":"2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3"} Dec 06 10:27:11 crc kubenswrapper[4672]: I1206 10:27:11.481627 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/must-gather-c8qqv" event={"ID":"c237ec5d-7c8c-423b-b427-d5064e2bce86","Type":"ContainerStarted","Data":"ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc"} Dec 06 10:27:11 crc kubenswrapper[4672]: I1206 10:27:11.496452 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2jnct/must-gather-c8qqv" podStartSLOduration=2.496429315 podStartE2EDuration="2.496429315s" podCreationTimestamp="2025-12-06 10:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:27:11.495949752 +0000 UTC m=+4849.240210049" watchObservedRunningTime="2025-12-06 10:27:11.496429315 +0000 UTC m=+4849.240689622" Dec 06 10:27:12 crc kubenswrapper[4672]: I1206 10:27:12.319900 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:27:12 crc kubenswrapper[4672]: I1206 10:27:12.319951 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.534376 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2jnct/crc-debug-b5ghj"] Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.535939 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.656971 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgwk9\" (UniqueName: \"kubernetes.io/projected/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-kube-api-access-jgwk9\") pod \"crc-debug-b5ghj\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.657028 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-host\") pod \"crc-debug-b5ghj\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.759249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgwk9\" (UniqueName: \"kubernetes.io/projected/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-kube-api-access-jgwk9\") pod \"crc-debug-b5ghj\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.759292 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-host\") pod \"crc-debug-b5ghj\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.759411 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-host\") pod \"crc-debug-b5ghj\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.782697 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgwk9\" (UniqueName: \"kubernetes.io/projected/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-kube-api-access-jgwk9\") pod \"crc-debug-b5ghj\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: I1206 10:27:15.850942 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:15 crc kubenswrapper[4672]: W1206 10:27:15.898570 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29bbc421_d7cb_49d2_bfd5_3e9e9f2735a0.slice/crio-0b2b47773b3371a2f7a9555ddb2139685400350b1414f3c1327405c80f8fe031 WatchSource:0}: Error finding container 0b2b47773b3371a2f7a9555ddb2139685400350b1414f3c1327405c80f8fe031: Status 404 returned error can't find the container with id 0b2b47773b3371a2f7a9555ddb2139685400350b1414f3c1327405c80f8fe031 Dec 06 10:27:16 crc kubenswrapper[4672]: I1206 10:27:16.520566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" event={"ID":"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0","Type":"ContainerStarted","Data":"6e9d7584e59fa93c4468387641c34fe8b7b27a0d4feb86c65e87114e58c7d59d"} Dec 06 10:27:16 crc kubenswrapper[4672]: I1206 10:27:16.521137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" event={"ID":"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0","Type":"ContainerStarted","Data":"0b2b47773b3371a2f7a9555ddb2139685400350b1414f3c1327405c80f8fe031"} Dec 06 10:27:16 crc kubenswrapper[4672]: I1206 10:27:16.544424 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" podStartSLOduration=1.544408585 podStartE2EDuration="1.544408585s" podCreationTimestamp="2025-12-06 10:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:27:16.537369335 +0000 UTC m=+4854.281629642" watchObservedRunningTime="2025-12-06 10:27:16.544408585 +0000 UTC m=+4854.288668872" Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.320240 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.320819 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.320871 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.321668 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7d8e30269340f1018466db3bc64484f24c1f912f42266b66c55a54ccd41efab"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.321715 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" 
containerID="cri-o://e7d8e30269340f1018466db3bc64484f24c1f912f42266b66c55a54ccd41efab" gracePeriod=600 Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.754447 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="e7d8e30269340f1018466db3bc64484f24c1f912f42266b66c55a54ccd41efab" exitCode=0 Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.754571 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"e7d8e30269340f1018466db3bc64484f24c1f912f42266b66c55a54ccd41efab"} Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.754813 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"} Dec 06 10:27:42 crc kubenswrapper[4672]: I1206 10:27:42.754858 4672 scope.go:117] "RemoveContainer" containerID="17b1ffbbcb6cd9a1b6e17c4c4ee59d9ad091952701613dcb8dbefbb24a8138f6" Dec 06 10:27:50 crc kubenswrapper[4672]: I1206 10:27:50.825346 4672 generic.go:334] "Generic (PLEG): container finished" podID="29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" containerID="6e9d7584e59fa93c4468387641c34fe8b7b27a0d4feb86c65e87114e58c7d59d" exitCode=0 Dec 06 10:27:50 crc kubenswrapper[4672]: I1206 10:27:50.825516 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" event={"ID":"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0","Type":"ContainerDied","Data":"6e9d7584e59fa93c4468387641c34fe8b7b27a0d4feb86c65e87114e58c7d59d"} Dec 06 10:27:51 crc kubenswrapper[4672]: I1206 10:27:51.939411 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:51 crc kubenswrapper[4672]: I1206 10:27:51.975673 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2jnct/crc-debug-b5ghj"] Dec 06 10:27:51 crc kubenswrapper[4672]: I1206 10:27:51.988245 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2jnct/crc-debug-b5ghj"] Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.066516 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-host\") pod \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.066679 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgwk9\" (UniqueName: \"kubernetes.io/projected/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-kube-api-access-jgwk9\") pod \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\" (UID: \"29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0\") " Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.066692 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-host" (OuterVolumeSpecName: "host") pod "29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" (UID: "29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.068001 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.071656 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-kube-api-access-jgwk9" (OuterVolumeSpecName: "kube-api-access-jgwk9") pod "29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" (UID: "29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0"). InnerVolumeSpecName "kube-api-access-jgwk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.169710 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgwk9\" (UniqueName: \"kubernetes.io/projected/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0-kube-api-access-jgwk9\") on node \"crc\" DevicePath \"\"" Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.570560 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" path="/var/lib/kubelet/pods/29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0/volumes" Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.843267 4672 scope.go:117] "RemoveContainer" containerID="6e9d7584e59fa93c4468387641c34fe8b7b27a0d4feb86c65e87114e58c7d59d" Dec 06 10:27:52 crc kubenswrapper[4672]: I1206 10:27:52.843613 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b5ghj" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.177857 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2jnct/crc-debug-425lv"] Dec 06 10:27:53 crc kubenswrapper[4672]: E1206 10:27:53.179365 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" containerName="container-00" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.179450 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" containerName="container-00" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.179692 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bbc421-d7cb-49d2-bfd5-3e9e9f2735a0" containerName="container-00" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.180470 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.290762 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mt42\" (UniqueName: \"kubernetes.io/projected/453e3081-fdcf-482c-b0a9-971642658723-kube-api-access-8mt42\") pod \"crc-debug-425lv\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.290821 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/453e3081-fdcf-482c-b0a9-971642658723-host\") pod \"crc-debug-425lv\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.392277 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mt42\" (UniqueName: \"kubernetes.io/projected/453e3081-fdcf-482c-b0a9-971642658723-kube-api-access-8mt42\") pod \"crc-debug-425lv\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.392338 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/453e3081-fdcf-482c-b0a9-971642658723-host\") pod \"crc-debug-425lv\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.392486 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/453e3081-fdcf-482c-b0a9-971642658723-host\") pod \"crc-debug-425lv\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.421037 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mt42\" (UniqueName: \"kubernetes.io/projected/453e3081-fdcf-482c-b0a9-971642658723-kube-api-access-8mt42\") pod \"crc-debug-425lv\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.501966 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:53 crc kubenswrapper[4672]: W1206 10:27:53.529330 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod453e3081_fdcf_482c_b0a9_971642658723.slice/crio-09d81ac637044d643cc678a12fb48551aca9149a2f236d488326cc4b25c95416 WatchSource:0}: Error finding container 09d81ac637044d643cc678a12fb48551aca9149a2f236d488326cc4b25c95416: Status 404 returned error can't find the container with id 09d81ac637044d643cc678a12fb48551aca9149a2f236d488326cc4b25c95416 Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.855063 4672 generic.go:334] "Generic (PLEG): container finished" podID="453e3081-fdcf-482c-b0a9-971642658723" containerID="6e9d6f00400f4b599feb72c85c665c0940bd7f9b37cef45043e171f0fc90356c" exitCode=0 Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.855156 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-425lv" event={"ID":"453e3081-fdcf-482c-b0a9-971642658723","Type":"ContainerDied","Data":"6e9d6f00400f4b599feb72c85c665c0940bd7f9b37cef45043e171f0fc90356c"} Dec 06 10:27:53 crc kubenswrapper[4672]: I1206 10:27:53.855422 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-425lv" event={"ID":"453e3081-fdcf-482c-b0a9-971642658723","Type":"ContainerStarted","Data":"09d81ac637044d643cc678a12fb48551aca9149a2f236d488326cc4b25c95416"} Dec 06 10:27:54 crc kubenswrapper[4672]: I1206 10:27:54.284831 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2jnct/crc-debug-425lv"] Dec 06 10:27:54 crc kubenswrapper[4672]: I1206 10:27:54.294832 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2jnct/crc-debug-425lv"] Dec 06 10:27:54 crc kubenswrapper[4672]: I1206 10:27:54.965390 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.023923 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mt42\" (UniqueName: \"kubernetes.io/projected/453e3081-fdcf-482c-b0a9-971642658723-kube-api-access-8mt42\") pod \"453e3081-fdcf-482c-b0a9-971642658723\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.024117 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/453e3081-fdcf-482c-b0a9-971642658723-host\") pod \"453e3081-fdcf-482c-b0a9-971642658723\" (UID: \"453e3081-fdcf-482c-b0a9-971642658723\") " Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.024336 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/453e3081-fdcf-482c-b0a9-971642658723-host" (OuterVolumeSpecName: "host") pod "453e3081-fdcf-482c-b0a9-971642658723" (UID: "453e3081-fdcf-482c-b0a9-971642658723"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.024668 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/453e3081-fdcf-482c-b0a9-971642658723-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.029805 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453e3081-fdcf-482c-b0a9-971642658723-kube-api-access-8mt42" (OuterVolumeSpecName: "kube-api-access-8mt42") pod "453e3081-fdcf-482c-b0a9-971642658723" (UID: "453e3081-fdcf-482c-b0a9-971642658723"). InnerVolumeSpecName "kube-api-access-8mt42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.125914 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mt42\" (UniqueName: \"kubernetes.io/projected/453e3081-fdcf-482c-b0a9-971642658723-kube-api-access-8mt42\") on node \"crc\" DevicePath \"\"" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.532952 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2jnct/crc-debug-b9dfl"] Dec 06 10:27:55 crc kubenswrapper[4672]: E1206 10:27:55.533407 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453e3081-fdcf-482c-b0a9-971642658723" containerName="container-00" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.533424 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="453e3081-fdcf-482c-b0a9-971642658723" containerName="container-00" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.533782 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="453e3081-fdcf-482c-b0a9-971642658723" containerName="container-00" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.534484 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.634646 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b11d9ce3-c564-4384-8373-8ab157e28eb7-host\") pod \"crc-debug-b9dfl\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.634956 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2942g\" (UniqueName: \"kubernetes.io/projected/b11d9ce3-c564-4384-8373-8ab157e28eb7-kube-api-access-2942g\") pod \"crc-debug-b9dfl\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.737422 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2942g\" (UniqueName: \"kubernetes.io/projected/b11d9ce3-c564-4384-8373-8ab157e28eb7-kube-api-access-2942g\") pod \"crc-debug-b9dfl\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.737779 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b11d9ce3-c564-4384-8373-8ab157e28eb7-host\") pod \"crc-debug-b9dfl\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.737950 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b11d9ce3-c564-4384-8373-8ab157e28eb7-host\") pod \"crc-debug-b9dfl\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.768286 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2942g\" (UniqueName: \"kubernetes.io/projected/b11d9ce3-c564-4384-8373-8ab157e28eb7-kube-api-access-2942g\") pod \"crc-debug-b9dfl\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.849562 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.874345 4672 scope.go:117] "RemoveContainer" containerID="6e9d6f00400f4b599feb72c85c665c0940bd7f9b37cef45043e171f0fc90356c" Dec 06 10:27:55 crc kubenswrapper[4672]: I1206 10:27:55.874388 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-425lv" Dec 06 10:27:56 crc kubenswrapper[4672]: I1206 10:27:56.568011 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453e3081-fdcf-482c-b0a9-971642658723" path="/var/lib/kubelet/pods/453e3081-fdcf-482c-b0a9-971642658723/volumes" Dec 06 10:27:56 crc kubenswrapper[4672]: I1206 10:27:56.885424 4672 generic.go:334] "Generic (PLEG): container finished" podID="b11d9ce3-c564-4384-8373-8ab157e28eb7" containerID="4ec07eea6e0794566080281ba62d185959eb6b9a110cef4f66c16cd3db9b018f" exitCode=0 Dec 06 10:27:56 crc kubenswrapper[4672]: I1206 10:27:56.885465 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-b9dfl" event={"ID":"b11d9ce3-c564-4384-8373-8ab157e28eb7","Type":"ContainerDied","Data":"4ec07eea6e0794566080281ba62d185959eb6b9a110cef4f66c16cd3db9b018f"} Dec 06 10:27:56 crc kubenswrapper[4672]: I1206 10:27:56.885493 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/crc-debug-b9dfl" event={"ID":"b11d9ce3-c564-4384-8373-8ab157e28eb7","Type":"ContainerStarted","Data":"005d62ddf2a2a451ca38ff5d0a2b4c3d0b7d75bec34601100be5e01fc11c1643"} Dec 06 10:27:56 crc kubenswrapper[4672]: I1206 10:27:56.922132 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2jnct/crc-debug-b9dfl"] Dec 06 10:27:56 crc kubenswrapper[4672]: I1206 10:27:56.931556 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2jnct/crc-debug-b9dfl"] Dec 06 10:27:57 crc kubenswrapper[4672]: I1206 10:27:57.990453 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.087938 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2942g\" (UniqueName: \"kubernetes.io/projected/b11d9ce3-c564-4384-8373-8ab157e28eb7-kube-api-access-2942g\") pod \"b11d9ce3-c564-4384-8373-8ab157e28eb7\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.088016 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b11d9ce3-c564-4384-8373-8ab157e28eb7-host\") pod \"b11d9ce3-c564-4384-8373-8ab157e28eb7\" (UID: \"b11d9ce3-c564-4384-8373-8ab157e28eb7\") " Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.088215 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b11d9ce3-c564-4384-8373-8ab157e28eb7-host" (OuterVolumeSpecName: "host") pod "b11d9ce3-c564-4384-8373-8ab157e28eb7" (UID: "b11d9ce3-c564-4384-8373-8ab157e28eb7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.088675 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b11d9ce3-c564-4384-8373-8ab157e28eb7-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.102798 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11d9ce3-c564-4384-8373-8ab157e28eb7-kube-api-access-2942g" (OuterVolumeSpecName: "kube-api-access-2942g") pod "b11d9ce3-c564-4384-8373-8ab157e28eb7" (UID: "b11d9ce3-c564-4384-8373-8ab157e28eb7"). InnerVolumeSpecName "kube-api-access-2942g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.190740 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2942g\" (UniqueName: \"kubernetes.io/projected/b11d9ce3-c564-4384-8373-8ab157e28eb7-kube-api-access-2942g\") on node \"crc\" DevicePath \"\"" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.569907 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11d9ce3-c564-4384-8373-8ab157e28eb7" path="/var/lib/kubelet/pods/b11d9ce3-c564-4384-8373-8ab157e28eb7/volumes" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.903227 4672 scope.go:117] "RemoveContainer" containerID="4ec07eea6e0794566080281ba62d185959eb6b9a110cef4f66c16cd3db9b018f" Dec 06 10:27:58 crc kubenswrapper[4672]: I1206 10:27:58.903631 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/crc-debug-b9dfl" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.152323 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7fcb648c94-6bbbh_28410d08-bb47-4a67-a4d8-c06929b8c644/barbican-api/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.353918 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7fcb648c94-6bbbh_28410d08-bb47-4a67-a4d8-c06929b8c644/barbican-api-log/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.396032 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868fbbdb46-nq8wk_79992d1b-dc0e-43ad-b6cd-942fadb148e6/barbican-keystone-listener-log/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.448479 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868fbbdb46-nq8wk_79992d1b-dc0e-43ad-b6cd-942fadb148e6/barbican-keystone-listener/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.617933 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d54549b45-whq64_963924f1-d56b-4422-af4a-cc5c3a17944f/barbican-worker/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.682983 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d54549b45-whq64_963924f1-d56b-4422-af4a-cc5c3a17944f/barbican-worker-log/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.944200 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cdwd9_87cce220-e210-44d8-ac72-946b6e9bb4c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:18 crc kubenswrapper[4672]: I1206 10:29:18.960022 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/ceilometer-central-agent/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.052352 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/ceilometer-notification-agent/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.128532 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/sg-core/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.164537 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_42976197-15a4-4ceb-baf3-fa56682d89a6/proxy-httpd/0.log" Dec 06 10:29:19 crc 
kubenswrapper[4672]: I1206 10:29:19.288262 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-8q75v_a0bb0cdb-025d-4251-b0f0-06185ea6fe8f/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.466949 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8792d_fc76ad12-899a-427b-abd7-57b3375a29ea/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.528075 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1d2747dd-122d-4920-a266-6be569a3ab33/cinder-api/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.692463 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1d2747dd-122d-4920-a266-6be569a3ab33/cinder-api-log/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.897582 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_db88dbb4-2112-4bec-a4e4-f0bf562bb173/probe/0.log" Dec 06 10:29:19 crc kubenswrapper[4672]: I1206 10:29:19.931353 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_db88dbb4-2112-4bec-a4e4-f0bf562bb173/cinder-backup/0.log" Dec 06 10:29:20 crc kubenswrapper[4672]: I1206 10:29:20.011806 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dda2373-3b28-4086-b29c-3415f50f1d92/cinder-scheduler/0.log" Dec 06 10:29:20 crc kubenswrapper[4672]: I1206 10:29:20.210358 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dda2373-3b28-4086-b29c-3415f50f1d92/probe/0.log" Dec 06 10:29:20 crc kubenswrapper[4672]: I1206 10:29:20.332715 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_a35af03e-7b48-40ee-a857-20824a664f4e/probe/0.log" Dec 06 10:29:20 crc kubenswrapper[4672]: I1206 10:29:20.372208 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_a35af03e-7b48-40ee-a857-20824a664f4e/cinder-volume/0.log" Dec 06 10:29:21 crc kubenswrapper[4672]: I1206 10:29:21.022771 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kdtt2_d3188b54-be64-4ee4-a4ff-4af6f300e58a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:21 crc kubenswrapper[4672]: I1206 10:29:21.039455 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7b7gc_02d2290f-9fc0-4247-8db8-660f26601528/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:21 crc kubenswrapper[4672]: I1206 10:29:21.541628 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b586f587-9rz4l_961904ba-a936-4912-b5a1-a20e4a4028e6/init/0.log" Dec 06 10:29:21 crc kubenswrapper[4672]: I1206 10:29:21.828002 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b586f587-9rz4l_961904ba-a936-4912-b5a1-a20e4a4028e6/init/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.041738 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9a6d2d22-6464-4bf5-9bf6-e3515efedbf4/glance-httpd/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.050919 4672 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b586f587-9rz4l_961904ba-a936-4912-b5a1-a20e4a4028e6/dnsmasq-dns/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.071592 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9a6d2d22-6464-4bf5-9bf6-e3515efedbf4/glance-log/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.331609 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_97cea7c5-c51e-4001-b398-28bdbccd9a97/glance-log/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.515863 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_97cea7c5-c51e-4001-b398-28bdbccd9a97/glance-httpd/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.642866 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c74dbc66-8ghhf_70d8ba3e-3f2d-4627-afab-5bb8908f89eb/horizon/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.823479 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d7wtj_131ab019-8934-4783-ab57-b3ecccd11b05/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.938640 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c74dbc66-8ghhf_70d8ba3e-3f2d-4627-afab-5bb8908f89eb/horizon-log/0.log" Dec 06 10:29:22 crc kubenswrapper[4672]: I1206 10:29:22.964550 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gzmnw_cad15908-dabc-4b48-9aa7-977801ce63ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.165840 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416921-tdtbn_f321169c-b38c-4403-8541-48064fd878b2/keystone-cron/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.281105 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76bb4c894-tw7m5_010644c2-5d3a-41e3-a27a-31a6e1d3a0b4/keystone-api/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.501309 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d6a82c78-9f40-4c1a-8f10-03f92549df7b/kube-state-metrics/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.523470 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bnr4r_cc85e883-c516-489f-b15d-6e57e4236b75/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.736454 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_42ee5091-4d7a-4807-905e-19dddd238386/manila-api/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.773866 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_42ee5091-4d7a-4807-905e-19dddd238386/manila-api-log/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.837338 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_ab157348-d161-4de2-bf4c-084cb71b0982/probe/0.log" Dec 06 10:29:23 crc kubenswrapper[4672]: I1206 10:29:23.920435 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_ab157348-d161-4de2-bf4c-084cb71b0982/manila-scheduler/0.log" Dec 06 10:29:24 crc kubenswrapper[4672]: I1206 10:29:24.082760 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d7cfcb36-13c1-4215-b316-b2082d41bcae/manila-share/0.log" Dec 06 10:29:24 crc kubenswrapper[4672]: I1206 10:29:24.174039 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d7cfcb36-13c1-4215-b316-b2082d41bcae/probe/0.log" Dec 06 10:29:24 crc kubenswrapper[4672]: I1206 10:29:24.587833 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d999c477-wf9vn_bc4c1773-bc77-4592-aff9-04323f477805/neutron-httpd/0.log" Dec 06 10:29:24 crc kubenswrapper[4672]: I1206 10:29:24.628579 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d999c477-wf9vn_bc4c1773-bc77-4592-aff9-04323f477805/neutron-api/0.log" Dec 06 10:29:25 crc kubenswrapper[4672]: I1206 10:29:25.087994 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2v64f_0a4871b2-574b-433c-8491-9147da825602/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:25 crc kubenswrapper[4672]: I1206 10:29:25.699957 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1/nova-api-log/0.log" Dec 06 10:29:25 crc kubenswrapper[4672]: I1206 10:29:25.735542 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ee39fc48-02ae-46b6-90b8-5b82cafad74d/nova-cell0-conductor-conductor/0.log" Dec 06 10:29:26 crc kubenswrapper[4672]: I1206 10:29:26.101254 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8eb0159c-5fe5-4ec4-9f3c-ba851fedf3f1/nova-api-api/0.log" Dec 06 10:29:26 crc kubenswrapper[4672]: I1206 10:29:26.351903 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2c1902d9-bb65-4974-a922-056811447603/nova-cell1-conductor-conductor/0.log" Dec 06 10:29:26 crc kubenswrapper[4672]: I1206 10:29:26.503667 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21ff730f-c3e2-4cf0-8e52-8345907156f1/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 10:29:26 crc kubenswrapper[4672]: I1206 10:29:26.652117 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9j9pb_b27237d2-1240-4f55-a12b-9248c3a899e4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:26 crc kubenswrapper[4672]: I1206 10:29:26.848920 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2610b3a3-94e4-4583-b42a-739e7dd1bfc7/nova-metadata-log/0.log" Dec 06 10:29:27 crc kubenswrapper[4672]: I1206 10:29:27.250429 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e0222d9d-628a-423d-b12a-377e94b3ac5c/nova-scheduler-scheduler/0.log" Dec 06 10:29:27 crc kubenswrapper[4672]: I1206 10:29:27.382642 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8c53efb2-1642-4efd-b920-7ad41e6c136a/mysql-bootstrap/0.log" Dec 06 10:29:27 crc kubenswrapper[4672]: I1206 10:29:27.601424 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8c53efb2-1642-4efd-b920-7ad41e6c136a/mysql-bootstrap/0.log" 
Dec 06 10:29:27 crc kubenswrapper[4672]: I1206 10:29:27.608441 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8c53efb2-1642-4efd-b920-7ad41e6c136a/galera/0.log" Dec 06 10:29:27 crc kubenswrapper[4672]: I1206 10:29:27.927671 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_37d0f081-e2da-4845-9097-31607c42efc4/mysql-bootstrap/0.log" Dec 06 10:29:28 crc kubenswrapper[4672]: I1206 10:29:28.183269 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_37d0f081-e2da-4845-9097-31607c42efc4/mysql-bootstrap/0.log" Dec 06 10:29:28 crc kubenswrapper[4672]: I1206 10:29:28.220159 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_37d0f081-e2da-4845-9097-31607c42efc4/galera/0.log" Dec 06 10:29:28 crc kubenswrapper[4672]: I1206 10:29:28.457715 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hxgmq_2f9d3ddd-e69d-48c6-ae37-094f18a1ddc6/ovn-controller/0.log" Dec 06 10:29:28 crc kubenswrapper[4672]: I1206 10:29:28.743381 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2610b3a3-94e4-4583-b42a-739e7dd1bfc7/nova-metadata-metadata/0.log" Dec 06 10:29:28 crc kubenswrapper[4672]: I1206 10:29:28.847391 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_44fea3e1-80c8-4525-b613-467978a95351/openstackclient/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.227089 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovsdb-server-init/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.270788 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-q5ktw_a8b30f64-653c-49e8-857d-af30b3126e2d/openstack-network-exporter/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.443264 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovs-vswitchd/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.501582 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovsdb-server-init/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.519647 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rsxq7_8cc7e2b2-ad6c-44f4-b477-951936b867d8/ovsdb-server/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.766933 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb/openstack-network-exporter/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.795515 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-ckzpr_3c8ad536-4cb5-4454-bcc3-5b13cb92215d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:29 crc kubenswrapper[4672]: I1206 10:29:29.956964 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a6f49a03-3f9d-46c1-86a8-9ad0a7e6c7fb/ovn-northd/0.log" Dec 06 10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.003339 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9f085ca1-832b-40dc-b131-2c287df92f6e/openstack-network-exporter/0.log" Dec 06 
10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.178266 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9f085ca1-832b-40dc-b131-2c287df92f6e/ovsdbserver-nb/0.log" Dec 06 10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.320886 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4cddfb03-e3ff-478e-91c7-e3b58145d1e6/openstack-network-exporter/0.log" Dec 06 10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.333946 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4cddfb03-e3ff-478e-91c7-e3b58145d1e6/ovsdbserver-sb/0.log" Dec 06 10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.594713 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-777f8d8c58-75kwt_46caf0fe-e392-43fb-8893-2a7bd67bd1a7/placement-api/0.log" Dec 06 10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.718583 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-777f8d8c58-75kwt_46caf0fe-e392-43fb-8893-2a7bd67bd1a7/placement-log/0.log" Dec 06 10:29:30 crc kubenswrapper[4672]: I1206 10:29:30.803689 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f71615b-1205-44b2-b4aa-c03548716486/setup-container/0.log" Dec 06 10:29:31 crc kubenswrapper[4672]: I1206 10:29:31.190687 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f71615b-1205-44b2-b4aa-c03548716486/setup-container/0.log" Dec 06 10:29:31 crc kubenswrapper[4672]: I1206 10:29:31.261021 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f71615b-1205-44b2-b4aa-c03548716486/rabbitmq/0.log" Dec 06 10:29:31 crc kubenswrapper[4672]: I1206 10:29:31.392871 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3cf9f22-30ac-48ca-9d05-407868710c73/setup-container/0.log" Dec 06 10:29:31 crc kubenswrapper[4672]: I1206 10:29:31.669862 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3cf9f22-30ac-48ca-9d05-407868710c73/setup-container/0.log" Dec 06 10:29:31 crc kubenswrapper[4672]: I1206 10:29:31.677671 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3cf9f22-30ac-48ca-9d05-407868710c73/rabbitmq/0.log" Dec 06 10:29:31 crc kubenswrapper[4672]: I1206 10:29:31.767587 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-h2tnt_bb01149f-0837-46f0-8636-b72f5fb85e9a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:32 crc kubenswrapper[4672]: I1206 10:29:32.104068 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-z6wlt_38dd2d36-2778-405f-97b8-d2651746de0c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:32 crc kubenswrapper[4672]: I1206 10:29:32.203475 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9f6tf_5c6bfe13-aab7-4455-9879-a1e1e7276407/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:32 crc kubenswrapper[4672]: I1206 10:29:32.384091 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pmvr2_e6867be1-002c-4eae-b841-885e3e5e5d20/ssh-known-hosts-edpm-deployment/0.log" Dec 06 10:29:32 crc kubenswrapper[4672]: I1206 10:29:32.586266 4672 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5faa4ff5-ff2e-43f8-b9b6-64f44e7a489d/tempest-tests-tempest-tests-runner/0.log" Dec 06 10:29:32 crc kubenswrapper[4672]: I1206 10:29:32.705300 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6db39db9-f682-4b08-adce-32d7478a345b/test-operator-logs-container/0.log" Dec 06 10:29:32 crc kubenswrapper[4672]: I1206 10:29:32.911822 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kzvqn_53ed8161-58e0-4b3b-91bf-190216b16b12/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 10:29:42 crc kubenswrapper[4672]: I1206 10:29:42.324963 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:29:42 crc kubenswrapper[4672]: I1206 10:29:42.325542 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:29:47 crc kubenswrapper[4672]: I1206 10:29:47.040159 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7595f929-2c12-4a7f-ba33-2701f7a701ee/memcached/0.log" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.158648 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk"] Dec 06 10:30:00 crc kubenswrapper[4672]: E1206 10:30:00.161016 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11d9ce3-c564-4384-8373-8ab157e28eb7" containerName="container-00" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.161116 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11d9ce3-c564-4384-8373-8ab157e28eb7" containerName="container-00" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.161391 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11d9ce3-c564-4384-8373-8ab157e28eb7" containerName="container-00" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.162270 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.164933 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.165510 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.172394 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk"] Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.269815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh59p\" (UniqueName: \"kubernetes.io/projected/c970d87b-a996-4f22-b099-48730d112392-kube-api-access-kh59p\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.270003 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c970d87b-a996-4f22-b099-48730d112392-secret-volume\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.270057 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c970d87b-a996-4f22-b099-48730d112392-config-volume\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.372547 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c970d87b-a996-4f22-b099-48730d112392-secret-volume\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.372703 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c970d87b-a996-4f22-b099-48730d112392-config-volume\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.374068 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c970d87b-a996-4f22-b099-48730d112392-config-volume\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.374324 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh59p\" (UniqueName: \"kubernetes.io/projected/c970d87b-a996-4f22-b099-48730d112392-kube-api-access-kh59p\") pod 
\"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.382169 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c970d87b-a996-4f22-b099-48730d112392-secret-volume\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.396263 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh59p\" (UniqueName: \"kubernetes.io/projected/c970d87b-a996-4f22-b099-48730d112392-kube-api-access-kh59p\") pod \"collect-profiles-29416950-4rlrk\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.485989 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:00 crc kubenswrapper[4672]: I1206 10:30:00.943972 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk"] Dec 06 10:30:01 crc kubenswrapper[4672]: I1206 10:30:01.029780 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" event={"ID":"c970d87b-a996-4f22-b099-48730d112392","Type":"ContainerStarted","Data":"61019b2adf5ea15a815420c6cb3f45be6ea04ff940413d4228ad0c84acf2429c"} Dec 06 10:30:02 crc kubenswrapper[4672]: I1206 10:30:02.039331 4672 generic.go:334] "Generic (PLEG): container finished" podID="c970d87b-a996-4f22-b099-48730d112392" containerID="5f0252e4f2cce5cbefea9448c957a6e7d46a4868917c3cdd88bab9c054d28394" exitCode=0 Dec 06 10:30:02 crc kubenswrapper[4672]: I1206 10:30:02.039464 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" event={"ID":"c970d87b-a996-4f22-b099-48730d112392","Type":"ContainerDied","Data":"5f0252e4f2cce5cbefea9448c957a6e7d46a4868917c3cdd88bab9c054d28394"} Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.473610 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.650010 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh59p\" (UniqueName: \"kubernetes.io/projected/c970d87b-a996-4f22-b099-48730d112392-kube-api-access-kh59p\") pod \"c970d87b-a996-4f22-b099-48730d112392\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.650090 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c970d87b-a996-4f22-b099-48730d112392-secret-volume\") pod \"c970d87b-a996-4f22-b099-48730d112392\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.650148 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c970d87b-a996-4f22-b099-48730d112392-config-volume\") pod \"c970d87b-a996-4f22-b099-48730d112392\" (UID: \"c970d87b-a996-4f22-b099-48730d112392\") " Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.650857 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c970d87b-a996-4f22-b099-48730d112392-config-volume" (OuterVolumeSpecName: "config-volume") pod "c970d87b-a996-4f22-b099-48730d112392" (UID: "c970d87b-a996-4f22-b099-48730d112392"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.651416 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c970d87b-a996-4f22-b099-48730d112392-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.657765 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c970d87b-a996-4f22-b099-48730d112392-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c970d87b-a996-4f22-b099-48730d112392" (UID: "c970d87b-a996-4f22-b099-48730d112392"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.662874 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c970d87b-a996-4f22-b099-48730d112392-kube-api-access-kh59p" (OuterVolumeSpecName: "kube-api-access-kh59p") pod "c970d87b-a996-4f22-b099-48730d112392" (UID: "c970d87b-a996-4f22-b099-48730d112392"). InnerVolumeSpecName "kube-api-access-kh59p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.753739 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh59p\" (UniqueName: \"kubernetes.io/projected/c970d87b-a996-4f22-b099-48730d112392-kube-api-access-kh59p\") on node \"crc\" DevicePath \"\"" Dec 06 10:30:03 crc kubenswrapper[4672]: I1206 10:30:03.754062 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c970d87b-a996-4f22-b099-48730d112392-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:30:04 crc kubenswrapper[4672]: I1206 10:30:04.071342 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" event={"ID":"c970d87b-a996-4f22-b099-48730d112392","Type":"ContainerDied","Data":"61019b2adf5ea15a815420c6cb3f45be6ea04ff940413d4228ad0c84acf2429c"} Dec 06 10:30:04 crc kubenswrapper[4672]: I1206 10:30:04.071590 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61019b2adf5ea15a815420c6cb3f45be6ea04ff940413d4228ad0c84acf2429c" Dec 06 10:30:04 crc kubenswrapper[4672]: I1206 10:30:04.071403 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-4rlrk" Dec 06 10:30:04 crc kubenswrapper[4672]: I1206 10:30:04.550993 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv"] Dec 06 10:30:04 crc kubenswrapper[4672]: I1206 10:30:04.567991 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-jm4xv"] Dec 06 10:30:05 crc kubenswrapper[4672]: I1206 10:30:05.825559 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/util/0.log" Dec 06 10:30:05 crc kubenswrapper[4672]: I1206 10:30:05.977303 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/pull/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.050138 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/util/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.069469 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/pull/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.274476 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/extract/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.274491 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/pull/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.277077 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafxvjrm_a2459c7d-a6d6-48c8-9a18-48d05c0129a9/util/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.461826 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-lh7x2_ce4e8b8a-4f3a-4303-9455-8eb984c06f57/kube-rbac-proxy/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.566827 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b72307-c01a-44b8-88ce-9a267335daff" path="/var/lib/kubelet/pods/42b72307-c01a-44b8-88ce-9a267335daff/volumes" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.571477 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-lh7x2_ce4e8b8a-4f3a-4303-9455-8eb984c06f57/manager/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.604285 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-cpc5n_7dc29189-4c37-4886-af89-7c6cb57f237e/kube-rbac-proxy/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.747547 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-cpc5n_7dc29189-4c37-4886-af89-7c6cb57f237e/manager/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.790147 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6jcpj_7e99a7a0-5a1d-4143-a8b7-9fb170d119a2/kube-rbac-proxy/0.log" Dec 06 10:30:06 crc kubenswrapper[4672]: I1206 10:30:06.843041 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6jcpj_7e99a7a0-5a1d-4143-a8b7-9fb170d119a2/manager/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.045828 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-p7c94_018edeb2-cc58-49fe-a7ea-15a8b6646ddd/kube-rbac-proxy/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.089680 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-p7c94_018edeb2-cc58-49fe-a7ea-15a8b6646ddd/manager/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.241711 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2zwxr_96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43/manager/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.299252 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2zwxr_96ee3cc6-bf15-4fa0-9efc-7a0aa1338b43/kube-rbac-proxy/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.323833 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dvzm4_7753548d-df52-4a65-b447-d20dcd379cde/kube-rbac-proxy/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.453662 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dvzm4_7753548d-df52-4a65-b447-d20dcd379cde/manager/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.514250 4672 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-rwjvr_6bbb7d8a-ba3a-476a-b09d-0fd084fc325e/kube-rbac-proxy/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.687332 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-rwjvr_6bbb7d8a-ba3a-476a-b09d-0fd084fc325e/manager/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.826821 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-8ql2p_9977f421-c235-40ef-8d9f-2e0125bf3593/kube-rbac-proxy/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.838884 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-8ql2p_9977f421-c235-40ef-8d9f-2e0125bf3593/manager/0.log" Dec 06 10:30:07 crc kubenswrapper[4672]: I1206 10:30:07.999295 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-j7cvj_308c58b1-3c6a-4c79-88fc-b4d515efd96d/kube-rbac-proxy/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.089986 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-j7cvj_308c58b1-3c6a-4c79-88fc-b4d515efd96d/manager/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.169783 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zxcvx_3fda2255-f593-42c6-b17e-2996a6ce7c5e/kube-rbac-proxy/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.318589 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zxcvx_3fda2255-f593-42c6-b17e-2996a6ce7c5e/manager/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.413543 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-crbgz_27d7a7f5-ab93-40b6-8718-0a8b930d2c0f/manager/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.420564 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-crbgz_27d7a7f5-ab93-40b6-8718-0a8b930d2c0f/kube-rbac-proxy/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.607328 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pqnb9_a59bea52-a8d1-4ac9-8ce0-0a623efcb009/kube-rbac-proxy/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.725646 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pqnb9_a59bea52-a8d1-4ac9-8ce0-0a623efcb009/manager/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.774413 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kpmch_8244458a-10b4-4c4f-8f9e-dc93e90329af/kube-rbac-proxy/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 10:30:08.946082 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-kpmch_8244458a-10b4-4c4f-8f9e-dc93e90329af/manager/0.log" Dec 06 10:30:08 crc kubenswrapper[4672]: I1206 
10:30:08.993402 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-nkk8g_73aa720c-9e22-4ef9-a5b4-512c0194f0a4/kube-rbac-proxy/0.log" Dec 06 10:30:09 crc kubenswrapper[4672]: I1206 10:30:09.002290 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-nkk8g_73aa720c-9e22-4ef9-a5b4-512c0194f0a4/manager/0.log" Dec 06 10:30:09 crc kubenswrapper[4672]: I1206 10:30:09.222350 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f586tjc_4794dd53-214a-4537-90c9-0527db628c8b/kube-rbac-proxy/0.log" Dec 06 10:30:09 crc kubenswrapper[4672]: I1206 10:30:09.254145 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f586tjc_4794dd53-214a-4537-90c9-0527db628c8b/manager/0.log" Dec 06 10:30:09 crc kubenswrapper[4672]: I1206 10:30:09.941821 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-vc2xx_63582a9a-093b-44e1-8932-4b910f301e52/operator/0.log" Dec 06 10:30:09 crc kubenswrapper[4672]: I1206 10:30:09.960891 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rx48l_250af723-f950-4125-8748-d7eac336f4c1/registry-server/0.log" Dec 06 10:30:10 crc kubenswrapper[4672]: I1206 10:30:10.087800 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nqh5d_e25e6854-1001-4962-bd9b-f4cb37ebefe1/kube-rbac-proxy/0.log" Dec 06 10:30:10 crc kubenswrapper[4672]: I1206 10:30:10.351862 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-vxgjl_d1ba66a9-3383-413f-b2d3-fb13a4e4592b/kube-rbac-proxy/0.log" Dec 06 10:30:10 crc kubenswrapper[4672]: I1206 10:30:10.360225 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nqh5d_e25e6854-1001-4962-bd9b-f4cb37ebefe1/manager/0.log" Dec 06 10:30:10 crc kubenswrapper[4672]: I1206 10:30:10.381366 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-vxgjl_d1ba66a9-3383-413f-b2d3-fb13a4e4592b/manager/0.log" Dec 06 10:30:10 crc kubenswrapper[4672]: I1206 10:30:10.794972 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-zpt5t_72a85d5f-d856-47b2-b0d6-f1fe23722f39/manager/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.321839 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5z9dq_d6abdea8-a426-4553-b4e7-8998d96eaed3/manager/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.357632 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5z9dq_d6abdea8-a426-4553-b4e7-8998d96eaed3/kube-rbac-proxy/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.358484 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ntvgh_dd2774f1-51aa-4387-aaf1-02cd8329ae1d/operator/0.log" Dec 06 10:30:11 crc 
kubenswrapper[4672]: I1206 10:30:11.510365 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-49652_30a955f4-c456-4d9e-9621-dce7e9f7b8b8/kube-rbac-proxy/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.633865 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-49652_30a955f4-c456-4d9e-9621-dce7e9f7b8b8/manager/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.661748 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-9p8xf_b88a6b36-14ee-4898-beb2-dae9d2be7600/kube-rbac-proxy/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.670170 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-9p8xf_b88a6b36-14ee-4898-beb2-dae9d2be7600/manager/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.847776 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xbspr_274d0d53-a194-47e5-b20d-e56155f01e72/kube-rbac-proxy/0.log" Dec 06 10:30:11 crc kubenswrapper[4672]: I1206 10:30:11.899208 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xbspr_274d0d53-a194-47e5-b20d-e56155f01e72/manager/0.log" Dec 06 10:30:12 crc kubenswrapper[4672]: I1206 10:30:12.320286 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:30:12 crc kubenswrapper[4672]: I1206 10:30:12.320627 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:30:29 crc kubenswrapper[4672]: I1206 10:30:29.716894 4672 scope.go:117] "RemoveContainer" containerID="7d95e0c8ff81809b4bdf44e73b16dcab4d27146c291fa055d1d0e0ac2caf7529" Dec 06 10:30:32 crc kubenswrapper[4672]: I1206 10:30:32.137916 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vrlvb_9a2d76b4-eb44-49ba-ad51-fbe3022af615/control-plane-machine-set-operator/0.log" Dec 06 10:30:32 crc kubenswrapper[4672]: I1206 10:30:32.382344 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b8m6z_87e773f5-6efb-4613-9af8-f05c7af849e1/kube-rbac-proxy/0.log" Dec 06 10:30:32 crc kubenswrapper[4672]: I1206 10:30:32.411217 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b8m6z_87e773f5-6efb-4613-9af8-f05c7af849e1/machine-api-operator/0.log" Dec 06 10:30:42 crc kubenswrapper[4672]: I1206 10:30:42.319538 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 06 10:30:42 crc kubenswrapper[4672]: I1206 10:30:42.320131 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:30:42 crc kubenswrapper[4672]: I1206 10:30:42.320180 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" Dec 06 10:30:42 crc kubenswrapper[4672]: I1206 10:30:42.320962 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"} pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:30:42 crc kubenswrapper[4672]: I1206 10:30:42.321010 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" containerID="cri-o://94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" gracePeriod=600 Dec 06 10:30:42 crc kubenswrapper[4672]: E1206 10:30:42.441547 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:30:43 crc kubenswrapper[4672]: I1206 10:30:43.418454 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" exitCode=0 Dec 06 10:30:43 crc kubenswrapper[4672]: I1206 10:30:43.418677 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerDied","Data":"94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"} Dec 06 10:30:43 crc kubenswrapper[4672]: I1206 10:30:43.418900 4672 scope.go:117] "RemoveContainer" containerID="e7d8e30269340f1018466db3bc64484f24c1f912f42266b66c55a54ccd41efab" Dec 06 10:30:43 crc kubenswrapper[4672]: I1206 10:30:43.419682 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:30:43 crc kubenswrapper[4672]: E1206 10:30:43.424257 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:30:46 crc kubenswrapper[4672]: I1206 10:30:46.747725 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-kkdp5_ca049150-2cd7-48c8-a77a-90379dbd818b/cert-manager-controller/0.log" Dec 06 10:30:46 crc kubenswrapper[4672]: I1206 10:30:46.963920 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-kdh29_23285e10-efd9-47e7-929b-e3fa93131669/cert-manager-webhook/0.log" Dec 06 10:30:46 crc kubenswrapper[4672]: I1206 10:30:46.971176 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qscd7_9a0083d7-9175-4399-aaf0-0767c9d88faf/cert-manager-cainjector/0.log" Dec 06 10:30:54 crc kubenswrapper[4672]: I1206 10:30:54.558352 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:30:54 crc kubenswrapper[4672]: E1206 10:30:54.559945 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:31:00 crc kubenswrapper[4672]: I1206 10:31:00.099260 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-zzxp6_8681df0a-44cf-471f-9257-bda9bae18f87/nmstate-console-plugin/0.log" Dec 06 10:31:00 crc kubenswrapper[4672]: I1206 10:31:00.259642 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-kv76p_23695df9-9be3-41a1-af24-8e35e5a875d2/kube-rbac-proxy/0.log" Dec 06 10:31:00 crc kubenswrapper[4672]: I1206 10:31:00.264787 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7m49h_a127c4da-7435-45e4-b772-f8e53381bea2/nmstate-handler/0.log" Dec 06 10:31:00 crc kubenswrapper[4672]: I1206 10:31:00.386386 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-kv76p_23695df9-9be3-41a1-af24-8e35e5a875d2/nmstate-metrics/0.log" Dec 06 10:31:00 crc kubenswrapper[4672]: I1206 10:31:00.493064 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-9bwkk_63eadd21-65ec-4fbd-8c8c-265a1ade0b4c/nmstate-operator/0.log" Dec 06 10:31:00 crc kubenswrapper[4672]: I1206 10:31:00.654744 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-mp77w_08d88a07-50e0-4273-bbb4-9d6ed17820a8/nmstate-webhook/0.log" Dec 06 10:31:09 crc kubenswrapper[4672]: I1206 10:31:09.557511 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:31:09 crc kubenswrapper[4672]: E1206 10:31:09.558466 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.034283 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-f8648f98b-ljcvb_35023ac9-ea1e-4576-b700-4afe57f59230/kube-rbac-proxy/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.116181 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ljcvb_35023ac9-ea1e-4576-b700-4afe57f59230/controller/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.259839 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-mqk7f_2a795467-0c6f-4dae-bd0e-0595c9eb88b4/frr-k8s-webhook-server/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.359233 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.523226 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.543076 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.582798 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.630381 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.784410 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.795337 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.803128 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.850643 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.990844 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-reloader/0.log" Dec 06 10:31:17 crc kubenswrapper[4672]: I1206 10:31:17.998263 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-frr-files/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.015671 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/cp-metrics/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.082000 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/controller/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.173630 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/frr-metrics/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.270458 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/kube-rbac-proxy/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.334255 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/kube-rbac-proxy-frr/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.453536 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/reloader/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.601772 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-765868b4fd-qt2wp_216580e9-9198-4b66-bf50-46df3a04c88e/manager/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.722582 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6df5976447-kzfnr_8faaf896-2bc9-489b-97dc-29e0efa86a91/webhook-server/0.log" Dec 06 10:31:18 crc kubenswrapper[4672]: I1206 10:31:18.926588 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-skjzl_47d9472b-be65-46ea-8eff-fa70e315ed49/kube-rbac-proxy/0.log" Dec 06 10:31:19 crc kubenswrapper[4672]: I1206 10:31:19.436634 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-skjzl_47d9472b-be65-46ea-8eff-fa70e315ed49/speaker/0.log" Dec 06 10:31:19 crc kubenswrapper[4672]: I1206 10:31:19.691463 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wjtmh_faa92f29-2bae-4481-ab38-1a0b681d73d9/frr/0.log" Dec 06 10:31:22 crc kubenswrapper[4672]: I1206 10:31:22.564576 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:31:22 crc kubenswrapper[4672]: E1206 10:31:22.565128 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:31:34 crc kubenswrapper[4672]: I1206 10:31:34.090330 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/util/0.log" Dec 06 10:31:34 crc kubenswrapper[4672]: I1206 10:31:34.645149 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/util/0.log" Dec 06 10:31:34 crc kubenswrapper[4672]: I1206 10:31:34.706862 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/pull/0.log" Dec 06 10:31:34 crc kubenswrapper[4672]: I1206 10:31:34.760471 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/pull/0.log" Dec 06 10:31:34 crc kubenswrapper[4672]: I1206 10:31:34.970450 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/util/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.010969 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/pull/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.013734 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg76pf_08576097-cc2d-49d5-8bda-66efdd1f960a/extract/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.196165 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/util/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.381555 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/util/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.443720 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/pull/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.478037 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/pull/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.715830 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/util/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.718070 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/extract/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.754557 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vk6fj_d6fce567-e6b2-4968-afff-b87e8c3d5058/pull/0.log" Dec 06 10:31:35 crc kubenswrapper[4672]: I1206 10:31:35.892917 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-utilities/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.118441 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-utilities/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.156517 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-content/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.156992 
4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-content/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.429369 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-content/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.463358 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/extract-utilities/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.557753 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:31:36 crc kubenswrapper[4672]: E1206 10:31:36.557993 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.693472 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-utilities/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.853737 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rrkkx_7c188e7d-d705-41ce-bf0d-468de7745723/registry-server/0.log" Dec 06 10:31:36 crc kubenswrapper[4672]: I1206 10:31:36.986468 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-utilities/0.log" Dec 06 10:31:37 crc kubenswrapper[4672]: I1206 10:31:37.061321 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-content/0.log" Dec 06 10:31:37 crc kubenswrapper[4672]: I1206 10:31:37.061691 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-content/0.log" Dec 06 10:31:37 crc kubenswrapper[4672]: I1206 10:31:37.304492 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-utilities/0.log" Dec 06 10:31:37 crc kubenswrapper[4672]: I1206 10:31:37.361720 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/extract-content/0.log" Dec 06 10:31:37 crc kubenswrapper[4672]: I1206 10:31:37.658608 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zhbdf_6f374204-77e4-4b75-afaf-43579bc0506a/marketplace-operator/0.log" Dec 06 10:31:37 crc kubenswrapper[4672]: I1206 10:31:37.805955 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-utilities/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.053042 4672 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-utilities/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.079076 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-content/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.080277 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x867j_e6f78753-7aad-4178-bff5-d45475f4a3df/registry-server/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.081510 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-content/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.512362 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-content/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.547876 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/extract-utilities/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.655694 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4mbvq_55620a10-8ac9-47b4-88b9-7129c90c4ee4/registry-server/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.670418 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-utilities/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.842699 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-content/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.853068 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-utilities/0.log" Dec 06 10:31:38 crc kubenswrapper[4672]: I1206 10:31:38.898650 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-content/0.log" Dec 06 10:31:39 crc kubenswrapper[4672]: I1206 10:31:39.080625 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-utilities/0.log" Dec 06 10:31:39 crc kubenswrapper[4672]: I1206 10:31:39.101355 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/extract-content/0.log" Dec 06 10:31:39 crc kubenswrapper[4672]: I1206 10:31:39.729831 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whpj_e40e6dc4-bcb4-420e-93a6-7eb7c11e12c7/registry-server/0.log" Dec 06 10:31:48 crc kubenswrapper[4672]: I1206 10:31:48.768723 4672 trace.go:236] Trace[1285551064]: "Calculate volume metrics of rabbitmq-erlang-cookie for pod openstack/rabbitmq-server-0" (06-Dec-2025 10:31:47.039) (total time: 1727ms): Dec 06 10:31:48 crc kubenswrapper[4672]: Trace[1285551064]: [1.727614902s] [1.727614902s] END Dec 06 10:31:48 crc kubenswrapper[4672]: I1206 
10:31:48.770857 4672 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-4vzn5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:31:48 crc kubenswrapper[4672]: I1206 10:31:48.770940 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vzn5" podUID="637b32e8-5e9a-47ac-aeaf-60709cdfba63" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 10:31:48 crc kubenswrapper[4672]: I1206 10:31:48.791704 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:31:48 crc kubenswrapper[4672]: E1206 10:31:48.791936 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:32:03 crc kubenswrapper[4672]: I1206 10:32:03.556969 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:32:03 crc kubenswrapper[4672]: E1206 10:32:03.557826 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.241692 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l65q7"] Dec 06 10:32:06 crc kubenswrapper[4672]: E1206 10:32:06.242842 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c970d87b-a996-4f22-b099-48730d112392" containerName="collect-profiles" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.242857 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c970d87b-a996-4f22-b099-48730d112392" containerName="collect-profiles" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.243056 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c970d87b-a996-4f22-b099-48730d112392" containerName="collect-profiles" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.244437 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.263038 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l65q7"] Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.344034 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-utilities\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.344136 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndb2\" (UniqueName: \"kubernetes.io/projected/8f5e4610-f4ea-4997-821c-7692f4203765-kube-api-access-lndb2\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.344167 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-catalog-content\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.445755 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndb2\" (UniqueName: \"kubernetes.io/projected/8f5e4610-f4ea-4997-821c-7692f4203765-kube-api-access-lndb2\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.445807 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-catalog-content\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.445915 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-utilities\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.446479 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-utilities\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.446715 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-catalog-content\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.476877 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lndb2\" (UniqueName: \"kubernetes.io/projected/8f5e4610-f4ea-4997-821c-7692f4203765-kube-api-access-lndb2\") pod \"certified-operators-l65q7\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:06 crc kubenswrapper[4672]: I1206 10:32:06.567206 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:07 crc kubenswrapper[4672]: I1206 10:32:07.274141 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l65q7"] Dec 06 10:32:08 crc kubenswrapper[4672]: I1206 10:32:08.024196 4672 generic.go:334] "Generic (PLEG): container finished" podID="8f5e4610-f4ea-4997-821c-7692f4203765" containerID="18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4" exitCode=0 Dec 06 10:32:08 crc kubenswrapper[4672]: I1206 10:32:08.024496 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerDied","Data":"18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4"} Dec 06 10:32:08 crc kubenswrapper[4672]: I1206 10:32:08.024527 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerStarted","Data":"e42905e24d217de80623866e74ec5cb0dcae7026b4b4a452a8ba822865e27eed"} Dec 06 10:32:08 crc kubenswrapper[4672]: I1206 10:32:08.026723 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:32:09 crc kubenswrapper[4672]: I1206 10:32:09.053329 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerStarted","Data":"4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a"} Dec 06 10:32:10 crc kubenswrapper[4672]: I1206 10:32:10.066194 4672 generic.go:334] "Generic (PLEG): container finished" podID="8f5e4610-f4ea-4997-821c-7692f4203765" containerID="4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a" exitCode=0 Dec 06 10:32:10 crc kubenswrapper[4672]: I1206 10:32:10.066260 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerDied","Data":"4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a"} Dec 06 10:32:12 crc kubenswrapper[4672]: I1206 10:32:12.097315 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerStarted","Data":"23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced"} Dec 06 10:32:12 crc kubenswrapper[4672]: I1206 10:32:12.115916 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l65q7" podStartSLOduration=3.5170668750000003 podStartE2EDuration="6.115897724s" podCreationTimestamp="2025-12-06 10:32:06 +0000 UTC" firstStartedPulling="2025-12-06 10:32:08.026366117 +0000 UTC m=+5145.770626414" lastFinishedPulling="2025-12-06 10:32:10.625196976 +0000 UTC m=+5148.369457263" observedRunningTime="2025-12-06 10:32:12.114446465 +0000 UTC m=+5149.858706752" watchObservedRunningTime="2025-12-06 
10:32:12.115897724 +0000 UTC m=+5149.860158011" Dec 06 10:32:16 crc kubenswrapper[4672]: I1206 10:32:16.567998 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:16 crc kubenswrapper[4672]: I1206 10:32:16.568397 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:17 crc kubenswrapper[4672]: I1206 10:32:17.625700 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-l65q7" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="registry-server" probeResult="failure" output=< Dec 06 10:32:17 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Dec 06 10:32:17 crc kubenswrapper[4672]: > Dec 06 10:32:18 crc kubenswrapper[4672]: I1206 10:32:18.567385 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:32:18 crc kubenswrapper[4672]: E1206 10:32:18.567890 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:32:25 crc kubenswrapper[4672]: E1206 10:32:25.730769 4672 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.30:48746->38.102.83.30:37519: read tcp 38.102.83.30:48746->38.102.83.30:37519: read: connection reset by peer Dec 06 10:32:26 crc kubenswrapper[4672]: I1206 10:32:26.628490 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:26 crc kubenswrapper[4672]: I1206 10:32:26.701020 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:26 crc kubenswrapper[4672]: I1206 10:32:26.879504 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l65q7"] Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.230573 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l65q7" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="registry-server" containerID="cri-o://23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced" gracePeriod=2 Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.841278 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.954049 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndb2\" (UniqueName: \"kubernetes.io/projected/8f5e4610-f4ea-4997-821c-7692f4203765-kube-api-access-lndb2\") pod \"8f5e4610-f4ea-4997-821c-7692f4203765\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.954131 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-catalog-content\") pod \"8f5e4610-f4ea-4997-821c-7692f4203765\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.954222 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-utilities\") pod \"8f5e4610-f4ea-4997-821c-7692f4203765\" (UID: \"8f5e4610-f4ea-4997-821c-7692f4203765\") " Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.955230 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-utilities" (OuterVolumeSpecName: "utilities") pod "8f5e4610-f4ea-4997-821c-7692f4203765" (UID: "8f5e4610-f4ea-4997-821c-7692f4203765"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:32:28 crc kubenswrapper[4672]: I1206 10:32:28.975812 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5e4610-f4ea-4997-821c-7692f4203765-kube-api-access-lndb2" (OuterVolumeSpecName: "kube-api-access-lndb2") pod "8f5e4610-f4ea-4997-821c-7692f4203765" (UID: "8f5e4610-f4ea-4997-821c-7692f4203765"). InnerVolumeSpecName "kube-api-access-lndb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.016694 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f5e4610-f4ea-4997-821c-7692f4203765" (UID: "8f5e4610-f4ea-4997-821c-7692f4203765"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.056642 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndb2\" (UniqueName: \"kubernetes.io/projected/8f5e4610-f4ea-4997-821c-7692f4203765-kube-api-access-lndb2\") on node \"crc\" DevicePath \"\"" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.056681 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.056692 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5e4610-f4ea-4997-821c-7692f4203765-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.241130 4672 generic.go:334] "Generic (PLEG): container finished" podID="8f5e4610-f4ea-4997-821c-7692f4203765" containerID="23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced" exitCode=0 Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.241183 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerDied","Data":"23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced"} Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.241220 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l65q7" event={"ID":"8f5e4610-f4ea-4997-821c-7692f4203765","Type":"ContainerDied","Data":"e42905e24d217de80623866e74ec5cb0dcae7026b4b4a452a8ba822865e27eed"} Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.241239 4672 scope.go:117] "RemoveContainer" containerID="23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.241297 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l65q7" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.275121 4672 scope.go:117] "RemoveContainer" containerID="4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.293518 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l65q7"] Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.298122 4672 scope.go:117] "RemoveContainer" containerID="18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.303188 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l65q7"] Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.340688 4672 scope.go:117] "RemoveContainer" containerID="23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced" Dec 06 10:32:29 crc kubenswrapper[4672]: E1206 10:32:29.341231 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced\": container with ID starting with 23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced not found: ID does not exist" containerID="23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.341267 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced"} err="failed to get container status \"23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced\": rpc error: code = NotFound desc = could not find container \"23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced\": container with ID starting with 23533f52d1e7754a1e79857d21ac86e5a9b6ba458e9f725c9ea118c57e350ced not found: ID does not exist" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.341291 4672 scope.go:117] "RemoveContainer" containerID="4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a" Dec 06 10:32:29 crc kubenswrapper[4672]: E1206 10:32:29.341667 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a\": container with ID starting with 4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a not found: ID does not exist" containerID="4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.341701 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a"} err="failed to get container status \"4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a\": rpc error: code = NotFound desc = could not find container \"4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a\": container with ID starting with 4a56dfe51857d672562abb0e44b7a04c08a51d577880190d9f819d734f70d36a not found: ID does not exist" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.341723 4672 scope.go:117] "RemoveContainer" containerID="18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4" Dec 06 10:32:29 crc kubenswrapper[4672]: E1206 10:32:29.341992 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4\": container with ID starting with 18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4 not found: ID does not exist" containerID="18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4" Dec 06 10:32:29 crc kubenswrapper[4672]: I1206 10:32:29.342017 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4"} err="failed to get container status \"18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4\": rpc error: code = NotFound desc = could not find container \"18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4\": container with ID starting with 18e7a8de8c62711d3e46b66826bb860ad434ca91889a3f5ef4213c0a18df95d4 not found: ID does not exist" Dec 06 10:32:30 crc kubenswrapper[4672]: I1206 10:32:30.569658 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" path="/var/lib/kubelet/pods/8f5e4610-f4ea-4997-821c-7692f4203765/volumes" Dec 06 10:32:31 crc kubenswrapper[4672]: I1206 10:32:31.557263 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:32:31 crc kubenswrapper[4672]: E1206 10:32:31.558072 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:32:45 crc kubenswrapper[4672]: I1206 10:32:45.558630 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:32:45 crc kubenswrapper[4672]: E1206 10:32:45.559398 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:33:00 crc kubenswrapper[4672]: I1206 10:33:00.559168 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:33:00 crc kubenswrapper[4672]: E1206 10:33:00.559840 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:33:13 crc kubenswrapper[4672]: I1206 10:33:13.557397 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:33:13 crc kubenswrapper[4672]: E1206 10:33:13.558239 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:33:28 crc kubenswrapper[4672]: I1206 10:33:28.558057 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:33:28 crc kubenswrapper[4672]: E1206 10:33:28.559269 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:33:41 crc kubenswrapper[4672]: I1206 10:33:41.561477 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:33:41 crc kubenswrapper[4672]: E1206 10:33:41.562533 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.401377 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-twd5p"] Dec 06 10:33:50 crc kubenswrapper[4672]: E1206 10:33:50.402513 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="extract-utilities" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.402530 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="extract-utilities" Dec 06 10:33:50 crc kubenswrapper[4672]: E1206 10:33:50.402552 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="registry-server" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.402560 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="registry-server" Dec 06 10:33:50 crc kubenswrapper[4672]: E1206 10:33:50.403273 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="extract-content" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.403323 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="extract-content" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.403724 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5e4610-f4ea-4997-821c-7692f4203765" containerName="registry-server" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.405951 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.421753 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twd5p"] Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.441957 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-catalog-content\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.442071 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-utilities\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.442190 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpw92\" (UniqueName: \"kubernetes.io/projected/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-kube-api-access-lpw92\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.543498 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-utilities\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.543666 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpw92\" (UniqueName: \"kubernetes.io/projected/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-kube-api-access-lpw92\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.543784 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-catalog-content\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.544445 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-utilities\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.544556 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-catalog-content\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.571927 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lpw92\" (UniqueName: \"kubernetes.io/projected/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-kube-api-access-lpw92\") pod \"redhat-marketplace-twd5p\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:50 crc kubenswrapper[4672]: I1206 10:33:50.745638 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:33:51 crc kubenswrapper[4672]: I1206 10:33:51.261394 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twd5p"] Dec 06 10:33:51 crc kubenswrapper[4672]: W1206 10:33:51.267778 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd44bfb4_5ec4_433a_b31c_cb6464a9ee2e.slice/crio-18bfd1c85962379ba28f8af0ad01d62487c5d68d6e7eadd4b56a8cc735a410be WatchSource:0}: Error finding container 18bfd1c85962379ba28f8af0ad01d62487c5d68d6e7eadd4b56a8cc735a410be: Status 404 returned error can't find the container with id 18bfd1c85962379ba28f8af0ad01d62487c5d68d6e7eadd4b56a8cc735a410be Dec 06 10:33:52 crc kubenswrapper[4672]: I1206 10:33:52.095740 4672 generic.go:334] "Generic (PLEG): container finished" podID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerID="e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745" exitCode=0 Dec 06 10:33:52 crc kubenswrapper[4672]: I1206 10:33:52.096059 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerDied","Data":"e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745"} Dec 06 10:33:52 crc kubenswrapper[4672]: I1206 10:33:52.096090 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerStarted","Data":"18bfd1c85962379ba28f8af0ad01d62487c5d68d6e7eadd4b56a8cc735a410be"} Dec 06 10:33:52 crc kubenswrapper[4672]: I1206 10:33:52.566008 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:33:52 crc kubenswrapper[4672]: E1206 10:33:52.566397 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:33:53 crc kubenswrapper[4672]: I1206 10:33:53.106338 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerStarted","Data":"562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b"} Dec 06 10:33:54 crc kubenswrapper[4672]: I1206 10:33:54.116872 4672 generic.go:334] "Generic (PLEG): container finished" podID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerID="562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b" exitCode=0 Dec 06 10:33:54 crc kubenswrapper[4672]: I1206 10:33:54.116928 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" 
event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerDied","Data":"562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b"} Dec 06 10:33:55 crc kubenswrapper[4672]: I1206 10:33:55.136305 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerStarted","Data":"ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e"} Dec 06 10:33:55 crc kubenswrapper[4672]: I1206 10:33:55.156568 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-twd5p" podStartSLOduration=2.678073038 podStartE2EDuration="5.156550161s" podCreationTimestamp="2025-12-06 10:33:50 +0000 UTC" firstStartedPulling="2025-12-06 10:33:52.09800351 +0000 UTC m=+5249.842263797" lastFinishedPulling="2025-12-06 10:33:54.576480633 +0000 UTC m=+5252.320740920" observedRunningTime="2025-12-06 10:33:55.152639045 +0000 UTC m=+5252.896899352" watchObservedRunningTime="2025-12-06 10:33:55.156550161 +0000 UTC m=+5252.900810448" Dec 06 10:34:00 crc kubenswrapper[4672]: I1206 10:34:00.747434 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:34:00 crc kubenswrapper[4672]: I1206 10:34:00.749191 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:34:00 crc kubenswrapper[4672]: I1206 10:34:00.811136 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:34:01 crc kubenswrapper[4672]: I1206 10:34:01.226838 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:34:01 crc kubenswrapper[4672]: I1206 10:34:01.274269 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twd5p"] Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.206803 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-twd5p" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="registry-server" containerID="cri-o://ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e" gracePeriod=2 Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.651584 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.818873 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-utilities\") pod \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.819086 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-catalog-content\") pod \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.819216 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpw92\" (UniqueName: \"kubernetes.io/projected/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-kube-api-access-lpw92\") pod \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\" (UID: \"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e\") " Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.819928 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-utilities" (OuterVolumeSpecName: "utilities") pod "cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" (UID: "cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.825122 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-kube-api-access-lpw92" (OuterVolumeSpecName: "kube-api-access-lpw92") pod "cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" (UID: "cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e"). InnerVolumeSpecName "kube-api-access-lpw92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.849835 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" (UID: "cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.921562 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpw92\" (UniqueName: \"kubernetes.io/projected/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-kube-api-access-lpw92\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.921639 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:03 crc kubenswrapper[4672]: I1206 10:34:03.921656 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.218782 4672 generic.go:334] "Generic (PLEG): container finished" podID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerID="ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e" exitCode=0 Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.218847 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twd5p" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.218851 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerDied","Data":"ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e"} Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.219316 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twd5p" event={"ID":"cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e","Type":"ContainerDied","Data":"18bfd1c85962379ba28f8af0ad01d62487c5d68d6e7eadd4b56a8cc735a410be"} Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.219350 4672 scope.go:117] "RemoveContainer" containerID="ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.250260 4672 scope.go:117] "RemoveContainer" containerID="562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.281509 4672 scope.go:117] "RemoveContainer" containerID="e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.295377 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twd5p"] Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.309125 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-twd5p"] Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.340720 4672 scope.go:117] "RemoveContainer" containerID="ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e" Dec 06 10:34:04 crc kubenswrapper[4672]: E1206 10:34:04.341197 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e\": container with ID starting with ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e not found: ID does not exist" containerID="ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.341249 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e"} err="failed to get container status \"ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e\": rpc error: code = NotFound desc = could not find container \"ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e\": container with ID starting with ced811d68daa99c95159dbf54e94156e9dc982b0a37bc66954866a5583da691e not found: ID does not exist" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.341280 4672 scope.go:117] "RemoveContainer" containerID="562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b" Dec 06 10:34:04 crc kubenswrapper[4672]: E1206 10:34:04.341759 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b\": container with ID starting with 562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b not found: ID does not exist" containerID="562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.341796 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b"} err="failed to get container status \"562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b\": rpc error: code = NotFound desc = could not find container \"562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b\": container with ID starting with 562add6428d8a7f5be65aa96248541ee1dcee4dcb4853345be25ef0d16d91e8b not found: ID does not exist" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.341823 4672 scope.go:117] "RemoveContainer" containerID="e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745" Dec 06 10:34:04 crc kubenswrapper[4672]: E1206 10:34:04.342113 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745\": container with ID starting with e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745 not found: ID does not exist" containerID="e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.342159 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745"} err="failed to get container status \"e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745\": rpc error: code = NotFound desc = could not find container \"e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745\": container with ID starting with e3092d06f1011fe44149ccc732fe5efd6fd170904016dcd3d0c3695a4d854745 not found: ID does not exist" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.557694 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:34:04 crc kubenswrapper[4672]: E1206 10:34:04.558353 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:34:04 crc kubenswrapper[4672]: I1206 10:34:04.568771 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" path="/var/lib/kubelet/pods/cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e/volumes" Dec 06 10:34:13 crc kubenswrapper[4672]: I1206 10:34:13.329899 4672 generic.go:334] "Generic (PLEG): container finished" podID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerID="ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc" exitCode=0 Dec 06 10:34:13 crc kubenswrapper[4672]: I1206 10:34:13.329966 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2jnct/must-gather-c8qqv" event={"ID":"c237ec5d-7c8c-423b-b427-d5064e2bce86","Type":"ContainerDied","Data":"ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc"} Dec 06 10:34:13 crc kubenswrapper[4672]: I1206 10:34:13.332444 4672 scope.go:117] "RemoveContainer" containerID="ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc" Dec 06 10:34:14 crc kubenswrapper[4672]: I1206 10:34:14.236022 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2jnct_must-gather-c8qqv_c237ec5d-7c8c-423b-b427-d5064e2bce86/gather/0.log" Dec 06 10:34:15 crc kubenswrapper[4672]: I1206 10:34:15.557079 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:34:15 crc kubenswrapper[4672]: E1206 10:34:15.557349 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:34:27 crc kubenswrapper[4672]: I1206 10:34:27.557964 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e" Dec 06 10:34:27 crc kubenswrapper[4672]: E1206 10:34:27.558798 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" Dec 06 10:34:27 crc kubenswrapper[4672]: I1206 10:34:27.770704 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2jnct/must-gather-c8qqv"] Dec 06 10:34:27 crc kubenswrapper[4672]: I1206 10:34:27.771290 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2jnct/must-gather-c8qqv" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="copy" containerID="cri-o://2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3" gracePeriod=2 Dec 06 10:34:27 crc kubenswrapper[4672]: I1206 10:34:27.787278 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2jnct/must-gather-c8qqv"] Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 
10:34:28.212740 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2jnct_must-gather-c8qqv_c237ec5d-7c8c-423b-b427-d5064e2bce86/copy/0.log" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.213358 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.386095 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhhvq\" (UniqueName: \"kubernetes.io/projected/c237ec5d-7c8c-423b-b427-d5064e2bce86-kube-api-access-lhhvq\") pod \"c237ec5d-7c8c-423b-b427-d5064e2bce86\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.386192 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c237ec5d-7c8c-423b-b427-d5064e2bce86-must-gather-output\") pod \"c237ec5d-7c8c-423b-b427-d5064e2bce86\" (UID: \"c237ec5d-7c8c-423b-b427-d5064e2bce86\") " Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.403794 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c237ec5d-7c8c-423b-b427-d5064e2bce86-kube-api-access-lhhvq" (OuterVolumeSpecName: "kube-api-access-lhhvq") pod "c237ec5d-7c8c-423b-b427-d5064e2bce86" (UID: "c237ec5d-7c8c-423b-b427-d5064e2bce86"). InnerVolumeSpecName "kube-api-access-lhhvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.472434 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2jnct_must-gather-c8qqv_c237ec5d-7c8c-423b-b427-d5064e2bce86/copy/0.log" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.473021 4672 generic.go:334] "Generic (PLEG): container finished" podID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerID="2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3" exitCode=143 Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.473095 4672 scope.go:117] "RemoveContainer" containerID="2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.473207 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2jnct/must-gather-c8qqv" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.488069 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhhvq\" (UniqueName: \"kubernetes.io/projected/c237ec5d-7c8c-423b-b427-d5064e2bce86-kube-api-access-lhhvq\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.505960 4672 scope.go:117] "RemoveContainer" containerID="ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.612107 4672 scope.go:117] "RemoveContainer" containerID="2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3" Dec 06 10:34:28 crc kubenswrapper[4672]: E1206 10:34:28.614160 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3\": container with ID starting with 2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3 not found: ID does not exist" containerID="2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.614210 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3"} err="failed to get container status \"2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3\": rpc error: code = NotFound desc = could not find container \"2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3\": container with ID starting with 2ceeb04e454cdd87a68b109a5e54b511d54f99d82cd79a3052f9ddec67621ed3 not found: ID does not exist" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.614245 4672 scope.go:117] "RemoveContainer" containerID="ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc" Dec 06 10:34:28 crc kubenswrapper[4672]: E1206 10:34:28.614637 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc\": container with ID starting with ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc not found: ID does not exist" containerID="ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.614660 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc"} err="failed to get container status \"ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc\": rpc error: code = NotFound desc = could not find container \"ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc\": container with ID starting with ef2eea298a1e2adfb29f0904d8bd4192330b9efb9d8fc035b8a55f241bb604cc not found: ID does not exist" Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.641160 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c237ec5d-7c8c-423b-b427-d5064e2bce86-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c237ec5d-7c8c-423b-b427-d5064e2bce86" (UID: "c237ec5d-7c8c-423b-b427-d5064e2bce86"). InnerVolumeSpecName "must-gather-output". 
Dec 06 10:34:28 crc kubenswrapper[4672]: I1206 10:34:28.691685 4672 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c237ec5d-7c8c-423b-b427-d5064e2bce86-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 06 10:34:30 crc kubenswrapper[4672]: I1206 10:34:30.575074 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" path="/var/lib/kubelet/pods/c237ec5d-7c8c-423b-b427-d5064e2bce86/volumes"
Dec 06 10:34:38 crc kubenswrapper[4672]: I1206 10:34:38.557300 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"
Dec 06 10:34:38 crc kubenswrapper[4672]: E1206 10:34:38.557922 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:34:49 crc kubenswrapper[4672]: I1206 10:34:49.557123 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"
Dec 06 10:34:49 crc kubenswrapper[4672]: E1206 10:34:49.558758 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:35:00 crc kubenswrapper[4672]: I1206 10:35:00.557462 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"
Dec 06 10:35:00 crc kubenswrapper[4672]: E1206 10:35:00.558296 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:35:13 crc kubenswrapper[4672]: I1206 10:35:13.556406 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"
Dec 06 10:35:13 crc kubenswrapper[4672]: E1206 10:35:13.557221 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:35:28 crc kubenswrapper[4672]: I1206 10:35:28.557307 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"
Dec 06 10:35:28 crc kubenswrapper[4672]: E1206 10:35:28.558032 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4s7nh_openshift-machine-config-operator(b0e78155-0eda-42cd-b11b-fbd9e5cc1e39)\"" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39"
Dec 06 10:35:42 crc kubenswrapper[4672]: I1206 10:35:42.570371 4672 scope.go:117] "RemoveContainer" containerID="94b4f355b405b1790b65eef3318ce600b63c6d71249d935e5c628d5678535d3e"
Dec 06 10:35:43 crc kubenswrapper[4672]: I1206 10:35:43.226038 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" event={"ID":"b0e78155-0eda-42cd-b11b-fbd9e5cc1e39","Type":"ContainerStarted","Data":"b63e2ec37f5e0533f5e7a8dac66759a9f4e83b8e06da90f03fa3cfbeaa30f591"}
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.876966 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fvmrf"]
Dec 06 10:36:40 crc kubenswrapper[4672]: E1206 10:36:40.878131 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="extract-content"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878151 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="extract-content"
Dec 06 10:36:40 crc kubenswrapper[4672]: E1206 10:36:40.878172 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="gather"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878180 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="gather"
Dec 06 10:36:40 crc kubenswrapper[4672]: E1206 10:36:40.878194 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="copy"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878202 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="copy"
Dec 06 10:36:40 crc kubenswrapper[4672]: E1206 10:36:40.878210 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="extract-utilities"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878218 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="extract-utilities"
Dec 06 10:36:40 crc kubenswrapper[4672]: E1206 10:36:40.878237 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="registry-server"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878245 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="registry-server"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878456 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="copy"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878492 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd44bfb4-5ec4-433a-b31c-cb6464a9ee2e" containerName="registry-server"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.878503 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c237ec5d-7c8c-423b-b427-d5064e2bce86" containerName="gather"
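The repeating "back-off 5m0s" entries for machine-config-daemon above show kubelet's crash-loop back-off at its cap: each sync attempt is skipped while the back-off window is open, and the container is finally restarted at 10:35:42. A sketch of the schedule, assuming the commonly documented behavior (10s initial delay, doubling per failed start, 5-minute maximum); this is an illustration, not kubelet's actual implementation:

```go
// backoff.go — sketch of kubelet-style crash-loop restart delays, assuming a
// 10s initial delay that doubles per failure and caps at 5m.
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second // assumed initial back-off
	for i := 1; ; i++ {
		fmt.Printf("failed start %d: next restart in %v\n", i, delay)
		if delay == maxDelay {
			break // every further failure waits the full 5m0s, as above
		}
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
	// Prints: 10s, 20s, 40s, 1m20s, 2m40s, 5m0s.
}
```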
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.880258 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.893522 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvmrf"]
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.968032 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-catalog-content\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.968115 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-utilities\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:40 crc kubenswrapper[4672]: I1206 10:36:40.968169 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f5cz\" (UniqueName: \"kubernetes.io/projected/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-kube-api-access-5f5cz\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.070803 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-catalog-content\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.070929 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-utilities\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.071009 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f5cz\" (UniqueName: \"kubernetes.io/projected/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-kube-api-access-5f5cz\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.071582 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-catalog-content\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.071958 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-utilities\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.091935 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f5cz\" (UniqueName: \"kubernetes.io/projected/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-kube-api-access-5f5cz\") pod \"community-operators-fvmrf\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") " pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.214300 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:41 crc kubenswrapper[4672]: I1206 10:36:41.813720 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvmrf"]
Dec 06 10:36:42 crc kubenswrapper[4672]: I1206 10:36:42.832990 4672 generic.go:334] "Generic (PLEG): container finished" podID="def391ae-bf6f-4af9-88a4-112b6d1bb5a8" containerID="927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a" exitCode=0
Dec 06 10:36:42 crc kubenswrapper[4672]: I1206 10:36:42.833375 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerDied","Data":"927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a"}
Dec 06 10:36:42 crc kubenswrapper[4672]: I1206 10:36:42.833410 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerStarted","Data":"d61ed2bafcafa88462586441eccb0cd13e6ad3bca3f8c776c75868760f6e2f10"}
Dec 06 10:36:43 crc kubenswrapper[4672]: I1206 10:36:43.846549 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerStarted","Data":"e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781"}
Dec 06 10:36:43 crc kubenswrapper[4672]: I1206 10:36:43.865260 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wr7ph"]
Dec 06 10:36:43 crc kubenswrapper[4672]: I1206 10:36:43.867383 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wr7ph"
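The PLEG events above trace the marketplace catalog pod's usual sequence: a short-lived extract container runs and exits 0, the next one starts, and eventually the long-running registry-server comes up. A sketch (hypothetical helper, std-lib regex over the journal text) that folds these events into a per-pod timeline:

```go
// plegtimeline.go — hypothetical sketch: collect "SyncLoop (PLEG)" events
// per pod to see the ContainerStarted/ContainerDied order for the catalog
// pods above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var plegRe = regexp.MustCompile(
	`"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=.*?"Type":"(ContainerStarted|ContainerDied)","Data":"([0-9a-f]{64})"`)

func main() {
	timeline := map[string][]string{} // pod -> ordered events
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
	for sc.Scan() {
		if m := plegRe.FindStringSubmatch(sc.Text()); m != nil {
			pod, typ, id := m[1], m[2], m[3]
			timeline[pod] = append(timeline[pod], typ+" "+id[:12])
		}
	}
	for pod, events := range timeline {
		fmt.Println(pod)
		for _, e := range events {
			fmt.Println("  ", e)
		}
	}
}
```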
Dec 06 10:36:43 crc kubenswrapper[4672]: I1206 10:36:43.886689 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wr7ph"]
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.065687 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-utilities\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.066046 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2g6\" (UniqueName: \"kubernetes.io/projected/a41d094f-ed3b-4638-a0ca-1a6348561fc1-kube-api-access-5m2g6\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.066132 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-catalog-content\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.167669 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-utilities\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.167757 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2g6\" (UniqueName: \"kubernetes.io/projected/a41d094f-ed3b-4638-a0ca-1a6348561fc1-kube-api-access-5m2g6\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.167841 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-catalog-content\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.168115 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-utilities\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.168132 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-catalog-content\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.188674 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2g6\" (UniqueName: \"kubernetes.io/projected/a41d094f-ed3b-4638-a0ca-1a6348561fc1-kube-api-access-5m2g6\") pod \"redhat-operators-wr7ph\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") " pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.211888 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.717088 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wr7ph"]
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.857620 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerStarted","Data":"5bfd82353f3c032a394f23d7c78da596f86aeaa7f03a16172cf29076864200ca"}
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.859748 4672 generic.go:334] "Generic (PLEG): container finished" podID="def391ae-bf6f-4af9-88a4-112b6d1bb5a8" containerID="e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781" exitCode=0
Dec 06 10:36:44 crc kubenswrapper[4672]: I1206 10:36:44.859777 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerDied","Data":"e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781"}
Dec 06 10:36:45 crc kubenswrapper[4672]: I1206 10:36:45.872572 4672 generic.go:334] "Generic (PLEG): container finished" podID="a41d094f-ed3b-4638-a0ca-1a6348561fc1" containerID="e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a" exitCode=0
Dec 06 10:36:45 crc kubenswrapper[4672]: I1206 10:36:45.872661 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerDied","Data":"e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a"}
Dec 06 10:36:45 crc kubenswrapper[4672]: I1206 10:36:45.877937 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerStarted","Data":"58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105"}
Dec 06 10:36:47 crc kubenswrapper[4672]: I1206 10:36:47.899131 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerStarted","Data":"8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c"}
Dec 06 10:36:47 crc kubenswrapper[4672]: I1206 10:36:47.916982 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fvmrf" podStartSLOduration=5.453690915 podStartE2EDuration="7.916963645s" podCreationTimestamp="2025-12-06 10:36:40 +0000 UTC" firstStartedPulling="2025-12-06 10:36:42.836801406 +0000 UTC m=+5420.581061703" lastFinishedPulling="2025-12-06 10:36:45.300074136 +0000 UTC m=+5423.044334433" observedRunningTime="2025-12-06 10:36:45.931181102 +0000 UTC m=+5423.675441389" watchObservedRunningTime="2025-12-06 10:36:47.916963645 +0000 UTC m=+5425.661223932"
Dec 06 10:36:51 crc kubenswrapper[4672]: I1206 10:36:51.215386 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fvmrf"
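The startup-latency entry above is internally consistent: podStartE2EDuration (7.916963645s) minus the image-pull window (lastFinishedPulling − firstStartedPulling = 10:36:45.300074136 − 10:36:42.836801406 = 2.463272730s) equals podStartSLOduration (5.453690915s), i.e. the SLO figure excludes time spent pulling images. A quick check using the timestamps from that entry:

```go
// slocheck.go — verify podStartSLOduration = podStartE2EDuration minus the
// image-pull window, using the values from the entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.000000000 -0700 MST"
	first, _ := time.Parse(layout, "2025-12-06 10:36:42.836801406 +0000 UTC")
	last, _ := time.Parse(layout, "2025-12-06 10:36:45.300074136 +0000 UTC")
	e2e := 7916963645 * time.Nanosecond // podStartE2EDuration="7.916963645s"
	pull := last.Sub(first)             // 2.46327273s of image pulling
	fmt.Println(e2e - pull)             // 5.453690915s == podStartSLOduration
}
```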
Dec 06 10:36:51 crc kubenswrapper[4672]: I1206 10:36:51.216141 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:51 crc kubenswrapper[4672]: I1206 10:36:51.284930 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:51 crc kubenswrapper[4672]: I1206 10:36:51.937344 4672 generic.go:334] "Generic (PLEG): container finished" podID="a41d094f-ed3b-4638-a0ca-1a6348561fc1" containerID="8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c" exitCode=0
Dec 06 10:36:51 crc kubenswrapper[4672]: I1206 10:36:51.937422 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerDied","Data":"8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c"}
Dec 06 10:36:52 crc kubenswrapper[4672]: I1206 10:36:52.007404 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:52 crc kubenswrapper[4672]: I1206 10:36:52.451836 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvmrf"]
Dec 06 10:36:52 crc kubenswrapper[4672]: I1206 10:36:52.947304 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerStarted","Data":"7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd"}
Dec 06 10:36:52 crc kubenswrapper[4672]: I1206 10:36:52.983350 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wr7ph" podStartSLOduration=3.436167727 podStartE2EDuration="9.98333114s" podCreationTimestamp="2025-12-06 10:36:43 +0000 UTC" firstStartedPulling="2025-12-06 10:36:45.875124977 +0000 UTC m=+5423.619385264" lastFinishedPulling="2025-12-06 10:36:52.42228839 +0000 UTC m=+5430.166548677" observedRunningTime="2025-12-06 10:36:52.982880888 +0000 UTC m=+5430.727141175" watchObservedRunningTime="2025-12-06 10:36:52.98333114 +0000 UTC m=+5430.727591437"
Dec 06 10:36:53 crc kubenswrapper[4672]: I1206 10:36:53.955181 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fvmrf" podUID="def391ae-bf6f-4af9-88a4-112b6d1bb5a8" containerName="registry-server" containerID="cri-o://58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105" gracePeriod=2
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.212563 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.212845 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.430424 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.574325 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f5cz\" (UniqueName: \"kubernetes.io/projected/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-kube-api-access-5f5cz\") pod \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") "
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.574465 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-utilities\") pod \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") "
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.574516 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-catalog-content\") pod \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\" (UID: \"def391ae-bf6f-4af9-88a4-112b6d1bb5a8\") "
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.575062 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-utilities" (OuterVolumeSpecName: "utilities") pod "def391ae-bf6f-4af9-88a4-112b6d1bb5a8" (UID: "def391ae-bf6f-4af9-88a4-112b6d1bb5a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.575251 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.579857 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-kube-api-access-5f5cz" (OuterVolumeSpecName: "kube-api-access-5f5cz") pod "def391ae-bf6f-4af9-88a4-112b6d1bb5a8" (UID: "def391ae-bf6f-4af9-88a4-112b6d1bb5a8"). InnerVolumeSpecName "kube-api-access-5f5cz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.625130 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "def391ae-bf6f-4af9-88a4-112b6d1bb5a8" (UID: "def391ae-bf6f-4af9-88a4-112b6d1bb5a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
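The reconciler entries in this section follow a fixed per-volume ordering: VerifyControllerAttachedVolume → MountVolume → MountVolume.SetUp on pod creation, and UnmountVolume → UnmountVolume.TearDown → "Volume detached" on deletion. A small sketch (not kubelet code) that checks an observed phase sequence against that expected order:

```go
// volorder.go — minimal sketch: check that per-volume teardown phases from
// entries like the ones above occur in the expected order.
package main

import "fmt"

// Expected unmount order for one volume, per the entries above.
var unmountOrder = []string{
	"UnmountVolume started",
	"UnmountVolume.TearDown succeeded",
	"Volume detached",
}

// inOrder reports whether observed contains expected as a subsequence
// (events for other volumes may be interleaved and are ignored).
func inOrder(observed, expected []string) bool {
	i := 0
	for _, o := range observed {
		if i < len(expected) && o == expected[i] {
			i++
		}
	}
	return i == len(expected)
}

func main() {
	// Phases seen for "utilities" of community-operators-fvmrf above.
	observed := []string{
		"UnmountVolume started",
		"UnmountVolume.TearDown succeeded",
		"Volume detached",
	}
	fmt.Println("unmount order ok:", inOrder(observed, unmountOrder))
}
```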
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.677293 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f5cz\" (UniqueName: \"kubernetes.io/projected/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-kube-api-access-5f5cz\") on node \"crc\" DevicePath \"\""
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.677321 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def391ae-bf6f-4af9-88a4-112b6d1bb5a8-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.964538 4672 generic.go:334] "Generic (PLEG): container finished" podID="def391ae-bf6f-4af9-88a4-112b6d1bb5a8" containerID="58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105" exitCode=0
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.964592 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvmrf"
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.964580 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerDied","Data":"58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105"}
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.966857 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvmrf" event={"ID":"def391ae-bf6f-4af9-88a4-112b6d1bb5a8","Type":"ContainerDied","Data":"d61ed2bafcafa88462586441eccb0cd13e6ad3bca3f8c776c75868760f6e2f10"}
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.966961 4672 scope.go:117] "RemoveContainer" containerID="58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105"
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.983690 4672 scope.go:117] "RemoveContainer" containerID="e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781"
Dec 06 10:36:54 crc kubenswrapper[4672]: I1206 10:36:54.999637 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvmrf"]
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.007716 4672 scope.go:117] "RemoveContainer" containerID="927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.010222 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fvmrf"]
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.039922 4672 scope.go:117] "RemoveContainer" containerID="58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105"
Dec 06 10:36:55 crc kubenswrapper[4672]: E1206 10:36:55.040265 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105\": container with ID starting with 58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105 not found: ID does not exist" containerID="58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.040295 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105"} err="failed to get container status \"58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105\": rpc error: code = NotFound desc = could not find container \"58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105\": container with ID starting with 58839aa56ea4d59b32743b31dd85105ed7a86dff59e8a7787c3344102c704105 not found: ID does not exist"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.040314 4672 scope.go:117] "RemoveContainer" containerID="e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781"
Dec 06 10:36:55 crc kubenswrapper[4672]: E1206 10:36:55.040537 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781\": container with ID starting with e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781 not found: ID does not exist" containerID="e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.040559 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781"} err="failed to get container status \"e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781\": rpc error: code = NotFound desc = could not find container \"e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781\": container with ID starting with e2eccb0b6fd3f0e62e37e46e17778d77397ce2b54291e61036deb8880bda7781 not found: ID does not exist"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.040574 4672 scope.go:117] "RemoveContainer" containerID="927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a"
Dec 06 10:36:55 crc kubenswrapper[4672]: E1206 10:36:55.040773 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a\": container with ID starting with 927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a not found: ID does not exist" containerID="927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.040793 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a"} err="failed to get container status \"927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a\": rpc error: code = NotFound desc = could not find container \"927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a\": container with ID starting with 927f2c5cd800b30c1e7a1f74abe29c675b5cd6504eaec1f8d8673321a3802f7a not found: ID does not exist"
Dec 06 10:36:55 crc kubenswrapper[4672]: I1206 10:36:55.273541 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wr7ph" podUID="a41d094f-ed3b-4638-a0ca-1a6348561fc1" containerName="registry-server" probeResult="failure" output=<
Dec 06 10:36:55 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s
Dec 06 10:36:55 crc kubenswrapper[4672]: >
Dec 06 10:36:56 crc kubenswrapper[4672]: I1206 10:36:56.568041 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def391ae-bf6f-4af9-88a4-112b6d1bb5a8" path="/var/lib/kubelet/pods/def391ae-bf6f-4af9-88a4-112b6d1bb5a8/volumes"
Dec 06 10:37:04 crc kubenswrapper[4672]: I1206 10:37:04.274315 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:37:04 crc kubenswrapper[4672]: I1206 10:37:04.330959 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:37:04 crc kubenswrapper[4672]: I1206 10:37:04.518216 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wr7ph"]
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.089141 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wr7ph" podUID="a41d094f-ed3b-4638-a0ca-1a6348561fc1" containerName="registry-server" containerID="cri-o://7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd" gracePeriod=2
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.543238 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wr7ph"
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.736016 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m2g6\" (UniqueName: \"kubernetes.io/projected/a41d094f-ed3b-4638-a0ca-1a6348561fc1-kube-api-access-5m2g6\") pod \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") "
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.736261 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-utilities\") pod \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") "
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.736310 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-catalog-content\") pod \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\" (UID: \"a41d094f-ed3b-4638-a0ca-1a6348561fc1\") "
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.737552 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-utilities" (OuterVolumeSpecName: "utilities") pod "a41d094f-ed3b-4638-a0ca-1a6348561fc1" (UID: "a41d094f-ed3b-4638-a0ca-1a6348561fc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.743141 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41d094f-ed3b-4638-a0ca-1a6348561fc1-kube-api-access-5m2g6" (OuterVolumeSpecName: "kube-api-access-5m2g6") pod "a41d094f-ed3b-4638-a0ca-1a6348561fc1" (UID: "a41d094f-ed3b-4638-a0ca-1a6348561fc1"). InnerVolumeSpecName "kube-api-access-5m2g6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.839198 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m2g6\" (UniqueName: \"kubernetes.io/projected/a41d094f-ed3b-4638-a0ca-1a6348561fc1-kube-api-access-5m2g6\") on node \"crc\" DevicePath \"\""
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.839235 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.847581 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a41d094f-ed3b-4638-a0ca-1a6348561fc1" (UID: "a41d094f-ed3b-4638-a0ca-1a6348561fc1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:37:06 crc kubenswrapper[4672]: I1206 10:37:06.941475 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41d094f-ed3b-4638-a0ca-1a6348561fc1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.106975 4672 generic.go:334] "Generic (PLEG): container finished" podID="a41d094f-ed3b-4638-a0ca-1a6348561fc1" containerID="7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd" exitCode=0
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.107040 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerDied","Data":"7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd"}
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.107067 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wr7ph"
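The 10:36:55 startup-probe failure above ("timeout: failed to connect service \":50051\" within 1s") is the registry-server's gRPC port not yet accepting connections; the probe keeps retrying until the "started" / "ready" entries at 10:37:04. A minimal sketch of the same style of check, where a plain 1-second TCP dial stands in for the actual gRPC health probe:

```go
// probe.go — sketch of a 1s connect check against the registry-server port,
// in the spirit of the probe output above (plain TCP dial as a stand-in for
// the real gRPC health check).
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
	if err != nil {
		fmt.Printf("timeout: failed to connect service \":50051\" within 1s (%v)\n", err)
		os.Exit(1)
	}
	conn.Close()
	fmt.Println("service is serving on :50051")
}
```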
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.107086 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wr7ph" event={"ID":"a41d094f-ed3b-4638-a0ca-1a6348561fc1","Type":"ContainerDied","Data":"5bfd82353f3c032a394f23d7c78da596f86aeaa7f03a16172cf29076864200ca"}
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.107112 4672 scope.go:117] "RemoveContainer" containerID="7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.139508 4672 scope.go:117] "RemoveContainer" containerID="8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.176771 4672 scope.go:117] "RemoveContainer" containerID="e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.189320 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wr7ph"]
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.203762 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wr7ph"]
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.222745 4672 scope.go:117] "RemoveContainer" containerID="7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd"
Dec 06 10:37:07 crc kubenswrapper[4672]: E1206 10:37:07.223467 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd\": container with ID starting with 7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd not found: ID does not exist" containerID="7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.223512 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd"} err="failed to get container status \"7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd\": rpc error: code = NotFound desc = could not find container \"7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd\": container with ID starting with 7b5b524c2180e035cdf2eed19fc6563315a5ce904bad41cdc3cd6dc6682e07bd not found: ID does not exist"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.223551 4672 scope.go:117] "RemoveContainer" containerID="8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c"
Dec 06 10:37:07 crc kubenswrapper[4672]: E1206 10:37:07.224030 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c\": container with ID starting with 8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c not found: ID does not exist" containerID="8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.224058 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c"} err="failed to get container status \"8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c\": rpc error: code = NotFound desc = could not find container \"8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c\": container with ID starting with 8e0b5aedca43bbb71b9e55c6a9cc7258444d6eff796a83b8ab46b1449628376c not found: ID does not exist"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.224072 4672 scope.go:117] "RemoveContainer" containerID="e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a"
Dec 06 10:37:07 crc kubenswrapper[4672]: E1206 10:37:07.224399 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a\": container with ID starting with e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a not found: ID does not exist" containerID="e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a"
Dec 06 10:37:07 crc kubenswrapper[4672]: I1206 10:37:07.224425 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a"} err="failed to get container status \"e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a\": rpc error: code = NotFound desc = could not find container \"e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a\": container with ID starting with e82462a760d70564e47b7b061e17022d1907d995bc44ebfbb1a637a61004fc6a not found: ID does not exist"
Dec 06 10:37:08 crc kubenswrapper[4672]: I1206 10:37:08.567320 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41d094f-ed3b-4638-a0ca-1a6348561fc1" path="/var/lib/kubelet/pods/a41d094f-ed3b-4638-a0ca-1a6348561fc1/volumes"
Dec 06 10:37:42 crc kubenswrapper[4672]: I1206 10:37:42.319634 4672 patch_prober.go:28] interesting pod/machine-config-daemon-4s7nh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 10:37:42 crc kubenswrapper[4672]: I1206 10:37:42.320141 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4s7nh" podUID="b0e78155-0eda-42cd-b11b-fbd9e5cc1e39" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
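The final two entries show the machine-config-daemon liveness probe failing with connection refused on 127.0.0.1:8798/health, meaning nothing is listening on the health port (consistent with the container's earlier crash-looping). A sketch of the equivalent check, a hypothetical stand-in for kubelet's HTTP prober:

```go
// healthcheck.go — sketch of the liveness check implied by the last entries:
// an HTTP GET against 127.0.0.1:8798/health with a short timeout
// (hypothetical stand-in for kubelet's HTTP prober).
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 2 * time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
		fmt.Println("liveness failure:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("liveness status:", resp.Status)
}
```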